Worldwide protests from data protection authorities apparently don’t faze Clearview AI: The company wants to one day keep 100 billion images of people in its own database – and make almost everyone identifiable.
As a “Google for faces,” Clearview AI is controversial. On the one hand, the search engine equipped with AI image analysis can help solve crimes or find missing people. The company never tires of emphasizing this potential.
On the other hand, Clearview AI could be a kind of Pandora’s box of AI surveillance, because it makes facial recognition fast, mobile, and potentially applicable to everyone. The Clearview app was allegedly misused by billionaire John Catsimatidis to identify his daughter’s date.
In a government context, beyond the danger of extensive automated monitoring, there is a further risk of misidentification if those in charge uncritically accept the app’s results as truth.
Up to 100 billion images of people to be added to database
At the heart of Clearview AI is a database of images of people that the company gathers from the Internet and links to social media profiles and other information. Using an app, police officers, for example, can scan a face, compare it with the database, and find out more about a person in a matter of seconds.
In October 2021, Clearview announced that this database had grown from the original roughly three billion to around ten billion images. In addition, AI-powered image enhancement is supposed to improve the system’s recognition accuracy, for example by automatically processing hard-to-recognize images so they remain usable.
The Washington Post has now obtained access to an investor presentation by Clearview AI in which the surveillance company describes plans to add up to 100 billion images to its database – that would be around 13 images per person. The current data acquisition system is said to be able to process up to 1.5 billion images per month.
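The arithmetic behind those figures can be sanity-checked in a few lines. The world-population value below is an assumption for illustration (roughly 7.9 billion at the time of the presentation), not a number from the article:

```python
# Back-of-the-envelope check of the figures from the investor presentation.
target_images = 100e9      # planned database size: 100 billion images
world_population = 7.9e9   # assumed world population, late 2021

# Roughly how many images per person the full database would hold
images_per_person = target_images / world_population
print(round(images_per_person))  # -> 13

# At the claimed ingestion rate of 1.5 billion images per month,
# growing from the current ~10 billion to 100 billion images would take:
months_needed = (target_images - 10e9) / 1.5e9
print(months_needed)  # -> 60.0, i.e. about five years
```

At the stated acquisition rate, the 100-billion goal would thus be reachable within roughly five years of continuous collection.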
To that end, Clearview AI plans to raise about $50 million from investors, develop new products, increase international sales, and expand lobbying for “favorable regulation.” The presentation is dated December 2021.
Privacy regulators from Australia, the UK, and France, among others, are currently taking action against Clearview AI. Individual U.S. states are also pushing back against government use of the Clearview app.
The path for those who want their own data deleted from the Clearview database is curious: Clearview merely offers an e-mail address to which one is supposed to send, for example, a driver’s license with a portrait photo for identification. One can then request one’s own profile information with reference to the CCPA or the GDPR. So to delete private data collected without consent, Clearview demands even more private data.
Surveillance with the Clearview app: more efficient than in China?
In the presentation, Clearview AI boasts of having little competition, partly because big tech companies like Amazon and Microsoft are holding back. The company also claims its product outperforms state surveillance systems in China: Clearview asserts technical superiority because it builds its own database from photos, including metadata from public sources as well as social links.
Clearview founder Hoan Ton-That insists that this data collection complies with applicable law, even though it is done without the explicit consent of the data subjects.
In addition to existing collaborations with numerous government agencies in the U.S. and Europe and with the U.S. military, the presentation says Clearview wants to target private companies, for example to monitor cheap “gig economy” labor in the retail and e-commerce sectors.
Clearview AI also sees the financial industry as a potential market, although the company claimed in 2020 after the Catsimatidis revelation that it no longer wanted to work with private individuals or private companies.
In the future, “almost everyone in the world will be identifiable,” according to the presentation. Identification could also be based on a person’s gait pattern or on a fingerprint scanned at a distance. Likewise, a person could be located based on a photograph.