The Information Commissioner’s Office (ICO) has fined Clearview AI Inc £7,552,800 for using images of people in the UK, and elsewhere, that were collected from the web and social media to create a global online database that could be used for facial recognition.
The ICO has also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.
The ICO enforcement action comes after a joint investigation with the Office of the Australian Information Commissioner (OAIC), which focused on Clearview AI Inc’s use of people’s images, data scraping from the internet and the use of biometric data for facial recognition.
The ICO states that Clearview AI Inc has collected more than 20 billion images of people’s faces, along with associated data, from publicly available sources on the internet and social media platforms all over the world to create an online database. People were not informed that their images were being collected or used in this way.
The company provides a service that allows customers, including the police, to upload an image of a person to the company’s app, which is then checked for a match against all the images in the database.
The app then returns a list of images with characteristics similar to the photo provided by the customer, together with links to the websites where those images were found.
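In general terms, services of this kind work by converting each stored face into a numerical embedding and then finding the stored embeddings closest to that of the query photo. The sketch below illustrates that generic pattern only; it is not Clearview AI’s actual system, and the embedding function, database layout, vector size and similarity threshold are all assumptions made for illustration.

```python
# Illustrative sketch only: a generic embedding-based face lookup,
# NOT Clearview AI's actual implementation. All names and values are assumed.
import numpy as np

# Hypothetical pre-built index: one embedding vector per scraped image,
# stored alongside the URL of the page it was collected from.
DATABASE_EMBEDDINGS = np.random.rand(1000, 128)  # placeholder vectors
DATABASE_URLS = [f"https://example.com/photo/{i}" for i in range(1000)]

def embed_face(image_bytes: bytes) -> np.ndarray:
    """Placeholder for a face-embedding model (e.g. a CNN producing a
    128-dimensional vector). Here it just returns a deterministic
    pseudo-random vector so the example runs without a real model."""
    rng = np.random.default_rng(abs(hash(image_bytes)) % (2**32))
    return rng.random(128)

def search(query_image: bytes, top_k: int = 5) -> list[tuple[str, float]]:
    """Return the URLs of the top_k most similar stored faces,
    ranked by cosine similarity to the query embedding."""
    q = embed_face(query_image)
    db = DATABASE_EMBEDDINGS
    sims = db @ q / (np.linalg.norm(db, axis=1) * np.linalg.norm(q))
    best = np.argsort(sims)[::-1][:top_k]
    return [(DATABASE_URLS[i], float(sims[i])) for i in best]

if __name__ == "__main__":
    # A customer uploads a photo; the service returns candidate matches
    # with links back to the pages the images were scraped from.
    for url, score in search(b"uploaded-photo-bytes"):
        print(f"{score:.3f}  {url}")
```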
Given the high number of UK internet and social media users, Clearview AI Inc’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge.
Although Clearview AI Inc no longer offers its services to UK organisations, it has customers in other countries and is therefore still using the personal data of UK residents.
John Edwards, UK Information Commissioner, said: “Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images. The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.
“People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.”