Despite the controversy surrounding Polish-based facial recognition software PimEyes, an extensive test of the search engine shows that it has trouble identifying ordinary people.
Of the more than 25 searches performed by DailyMail.com, the AI-powered system had varying levels of trouble with 70 percent of photos, including images that were rotated or slightly blurred.
Searches for journalists and celebrities were fairly accurate, but only 26 percent of results were entirely accurate for the average person.
Yet this is also why security experts deem PimEyes a ‘serious security risk’ – the site surfaces links to social media accounts.
Some of the matches included URLs for the individual’s Instagram, TikTok, Tumblr and Facebook accounts, along with personal blogs.
Those looking to ‘stalk’ someone using PimEyes may be able to find their target, but will have to sift through a trove of pornographic images.
Approximately 15 percent of search results popped up with explicit images that link back to the original adult content site.
Other results had varying levels of success — one search returned a correct image from Facebook more than a decade old, but the rest of the returns were inaccurate.
Only 26 percent of returns were entirely accurate, with no apparent pattern, as most of the subjects had social media profiles that were either sparse or set to private.
The site, which draws on more than 900 million images, allows people to perform reverse image searches and find pictures around the web that it believes show the same person.
The free search offers only a generic name for the site where each matching picture was found, but the company notes that it does offer privacy features, including protection against ‘scammers, identity thieves, or people who use your image illegally.’
A search for DailyMail.com journalist Jen Smith pulled up mostly accurate images, all of which linked back to Daily Mail content
A paid subscription, which can cost as little as $23.99 per month, provides additional detail, such as the link where the picture was found.
Despite the lackluster results from DailyMail.com’s recent testing, Digital Warfare principal analyst James Knight said the service is ‘very open to abuse.’
‘Although it is marketed for individuals to search for their own image, anyone can search for your image,’ Knight said via email on Friday. ‘The main fear is that this will be used maliciously by stalkers to find more information on you.’
Knight continued: ‘One quick photo taken of someone could be uploaded to reveal potentially hundreds of photos of them, and from there, their name, address, phone numbers, email addresses. This tool is one more level in the eroding of personal privacy.’
While the free service uses artificial intelligence and machine-learning algorithms, the company’s paid offering goes into further detail, providing the link where each photo was found.
It also allows users to set up alerts that notify them when a matching image of an uploaded face is added to the internet.
One premium account allows users — individuals and businesses — to pay $29.99 for one month of access and 25 searches per day. A more expensive plan, at $299.99 a month, allows ‘unlimited’ searches for one month.
The company also provides a 20 percent discount if plans are billed annually.
In order for photos to be removed from the site, a person can request that they be taken down via an online form.
But to be truly removed from the search engine, PimEyes offers a ‘PROtect service’ for $79.99 a month that gives users ‘4 hours/month of professional services during which our agent endeavors to remove your photos from the source websites.’
London-based technology researcher Stephanie Hare told The Washington Post that it’s possible for strangers to keep tabs on other people’s lives.
‘What is stopping them? Literally nothing,’ Hare told the news outlet.
‘The people who put those pictures on the Internet – with their children, their parents, the people who might be vulnerable in their life – were not doing it so they could feed a database that companies could monetize,’ she continued. ‘I can leave my phone at home. What I can’t leave is my face.’
ORIGINAL: Pictured is another image DailyMail.com used to test how accurately PimEyes finds ordinary people on the web
RESULTS: Although the results do not show the man in the original, PimEyes was able to produce images with people wearing masks
DailyMail.com has reached out to PimEyes with a request for comment.
PimEyes has been used by people on 4chan and other websites who have paid for subscriptions to ‘stalk women,’ creating threads about them, according to The Post, citing an interview with Aaron DeVera, a security researcher in New York.
PimEyes has also come under fire in the past over whether its use of facial recognition violates the European Union’s General Data Protection Regulation (GDPR).
Daily Mail reporters tested the site and found it had limited and temperamental success for a photo of a member of the general public. Pictured, two images that were uploaded. The selfie on the left yielded no matches. The image on the right, from Facebook, did create matches
FAIL: Daily Mail tested the site to see how effective it was at finding non-famous people. When a selfie was put into the system, it found no correct matches (pictured)
MATCH: When a Facebook profile picture was used and not a new selfie, the site found more pictures which are available online (pictured, the results). However, it is worth noting that two of the results included the exact same image as the profile photo (top left)
However, the company says PimEyes was created as ‘a multi-purpose tool’ that allows people to track down their face on the internet, reclaim image rights, and monitor their online presence.
That means if a person inputs a picture of a stranger to the site and sets up an alert for their face, the system will let the customer know when it finds this stranger in a picture posted to Twitter, for example.
PimEyes premium allows people to set up as many as 25 alerts. These could cover 25 photos of a single person, to increase the odds of a match, or photos of 25 different people.
Dave Gershgorn recently reported for OneZero that the site also offers contracts to law-enforcement agencies.
When the Twitter display photo of MailOnline health editor Stephen Matthews (pictured) was fed into PimEyes, it found correct matches
He reports that the facial recognition software has the ability to scan ‘darknet websites’ as well as the surface web, and says that PimEyes is incorporated into the app of at least one company, called Paliscope.
Paliscope is targeted at law-enforcement agencies to allow them to use facial recognition on images and files, to help identify suspects in a case.
Paliscope recently partnered with the 4theOne Foundation, an organization dedicated to finding trafficked children.
PimEyes claims to allow people to protect their privacy and prevent the misuse of their images, but there are no safeguards in place to ensure a person uploads their own face.
There is nothing to prevent people uploading a photo of a stranger taken without permission and scouring the web for more details on their life.
The ability to seek out more images and information on a person from a single photo poses dangers to all manner of people.
PimEyes is not the first site to give the public access to facial recognition technology and search for a person’s face online.
In 2020, Russian search engine Yandex was accused of providing an unregulated facial recognition system and violating personal privacy.
Yandex, which claims to conduct more than 50 percent of Russian searches on Android, allows users to input an image and see results showing the exact same person.
Another Russian company, called NtechLab, launched FindFace in 2016. This functioned in a similar way to PimEyes and was extremely popular.
It was eventually shut down and the tech adapted for state surveillance. The company’s founders said at the time that they built the technology to help women find someone to date.
The photograph of Piers Morgan holding up a sausage roll yielded several different results of the journalist and TV personality from elsewhere on the internet (pictured)
Last year, Comparitech security specialist Brian Higgins told MailOnline that PimEyes poses ‘very serious and obvious privacy implications’.
‘I seriously doubt the developers are so naïve as to think it would only be used for its designated purpose,’ he added.
‘Unfortunately, the internet is not governed by any blanket privacy protections. Every site or platform will ask users to accept its Terms and Conditions.
‘These are almost always many pages long and will include reference to “image ownership” which often allows the provider rights to storage and use of any images.
‘The FindFace T&Cs were a prime example of this, yet global uptake was swift and substantial.
‘Unfortunately, nobody reads the Terms and Conditions, thus allowing platforms like PimEyes to leverage new and developing technologies to provide potentially unethical products and services.
‘The only solution to the problem is for individual internet users to take their own privacy more seriously and take steps to protect themselves.’
As recently as June 2020, MailOnline reporters tested the site and found it had limited and temperamental success for a photo of a member of the general public.
When used on celebrities, however, the site was unsurprisingly far more effective.
Images of Amanda Holden, Piers Morgan and Boris Johnson all yielded hundreds of correct matches, showing the effectiveness of the technology for a face that is regularly in the public domain.
It appears to be more effective on high-resolution images already online, as opposed to new selfies that have not been shared before.
Pictured, the image of MailOnline health reporter Connor Boyd that was input into the PimEyes system
This ability differentiates it from mainstream search engines such as Google, which do not use facial recognition in image searches.
Instead, Google’s technology looks for similar features such as attire and surroundings.
For example, if a white man with brown hair wearing a blue tie in front of a white background uses Google Image search on a selfie, it will provide images that match the description. It does not use data from a person’s face.
PimEyes, however, pulled up images of the reporter wearing a variety of clothes in different lighting and surroundings, indicating it does indeed use facial recognition.
Health editor at MailOnline, Stephen Matthews, used his Twitter profile picture and found the site was moderately successful.
It correctly matched his face to more than ten results, though six of them were the exact same photo hosted on different domains.
The other results were variations of two individual photos — smiling after receiving a journalism award and a front-on picture used in another MailOnline article. It did not find any surprising or hidden pictures.
MailOnline health reporter Connor Boyd also tested the PimEyes technology.
After inputting a clear image, the software was able to correctly identify just one other image of Boyd.
This image — a holiday photo from 2015 — was surprising as it is a photo with very little prominence online or on social media.
However, more readily available images of Boyd were not identified by the software, despite his face appearing in past articles on MailOnline, one of the world’s most read websites. All other results were incorrect.
For celebrities and well-known people like Tom Brady, PimEyes does bring up accurate results
WHAT CAN USERS DO TO PROTECT THEMSELVES FROM FACIAL RECOGNITION MISUSE ONLINE?
Felix Rosbach, product manager at German software development firm comforte AG, says there is only so much an individual can do to protect themselves from this tech.
He told MailOnline: ‘As a private individual, the only thing you can do to protect your data is to make sure your social media profiles aren’t publicly available and to share data only with trusted parties.’
But Mr Rosbach adds that there is unfortunately very little users can do if peers post pictures of them publicly.
He calls on the search engines themselves to ensure members of the public are protected.
He said: ‘Search engines should make sure that these functionalities can’t be misused.
‘But with machine learning software becoming broadly available, there will always be a page or an app that is able to offer this service.
‘Companies instead should and can make sure that sensitive consumer data is protected all the time.
‘And it’s not only about securing the access to data – it’s also about strong data protection to make sure data is useless in case of a data loss, a misconfiguration or a data breach.’