Scanning your iris to become a “verified human” in exchange for digital currency sounds like a Black Mirror episode. But this isn’t the story arc of a dystopian sci-fi show – it’s happening now with one of Web3’s latest projects.
The launch of Worldcoin on Optimism left many wondering if this project reversed Web3’s decentralization promise to build the exact opposite. And yet, over 2 million people in underserved areas have already signed up to share their biometric data with Worldcoin in exchange for 25 WLD, worth less than $100 at the time of writing.
This is not only weird; it also presents serious privacy risks and creates a honeypot for bad actors. Moreover, there is an argument to be made that it might even interfere with the sovereignty of foreign countries.
Why would we need Worldcoin in the first place?
Worldcoin was founded to solve the expected externalities of its sister company, OpenAI – the creator of ChatGPT and other popular AI products. One hand solves the problems the other creates.
2024: Bitcoiners try to save children from Sam Altman’s WorldCoin orb eye scans pic.twitter.com/ZeAWuNdSJs
— ₿it₿ry (@bchinella) July 26, 2023
In the words of its founders: “If successful, we believe Worldcoin could significantly increase economic opportunity, scale a reliable solution to distinguish humans from AI online while preserving privacy, enable global democratic processes and possibly show a potential route to AI-funded UBI [universal basic income].”
The problem with Worldcoin
Despite the ambition and promise to protect privacy, a whole new set of problems arises from the fact that this is done by a single, currently centralized company. The irony is not lost on ChatGPT. Some of its responses when prompted “What are the risks of having a company owning biometric data for individuals in underdeveloped countries?” include:
- Privacy breaches
- Security failures
- Surveillance and sovereignty
Ethereum co-founder Vitalik Buterin also echoed some of these concerns.
The fact that a company possesses the biometric data of individuals in underdeveloped countries presents significant risks to those individuals. On a broader societal scale, the risks become even more significant when associated with UBI payments to foreign citizens.
Biometric data such as iris scans is highly sensitive and unique to each individual. It can reveal information such as gender, ethnicity and, possibly, medical conditions. If only one company controls this data, there is a high risk of a privacy breach, as it can be used to track and monitor individuals without their consent.
Related: The world could face a bleak future thanks to CBDCs
Who is to say that the company would not exploit the biometric data for commercial purposes, such as targeted advertising or the sale of the data to other entities? Isn’t that diametrically opposed to what we’ve been trying to achieve for a few years?
The centralization of biometric data also exposes it to a higher risk of being targeted by hackers and cybercriminals. In the security industry, a “honeypot” is a deliberately attractive trove of data set up to lure and study attackers. A large store of valuable biometric data held by a single entity plays the same role unintentionally, which implies it will eventually be hacked.
Related: CBDCs will lead to absolute government control
A data breach on this scale could lead to serious consequences, including identity theft, fraud and unauthorized access to the personal information of millions of people.
Surveillance and sovereignty
Governments could also subpoena this data and obtain personal information about citizens without a warrant; there are fewer protections once you have sold your data to a third party. A corrupt government could use this data to manipulate behavior, limit dissent and suppress opposition, essentially turning underdeveloped regions into surveillance states.
Moreover, if the company operates across borders, it could wield undue power and influence over governments and corporations. Financially supporting large numbers of foreign citizens under a universal basic income model could ultimately reduce the autonomy and sovereignty of a country’s democratic processes.
When visiting Worldcoin orbs to scan their irises, registrants receive a promotional sticker that reads “Verified Human”. There’s a slight sense of discomfort in being called a “human” here rather than a “person”.
In the context of selling your identity for a few bucks to a cryptocurrency project tied to AI development, that almost sounds like a Freudian slip. It’s as if personhood were a forgotten idea, and now we’re just humans in a massive database of biometrics.
Sometimes reality really is stranger than fiction.
Matthew Niemerg is co-founder and board member of the Aleph Zero Foundation. He holds a doctorate in mathematics from Colorado State University and is currently an expert with the EU Blockchain Observatory and Forum. He is also co-founder of Cardinal Cryptography.
This article is for general informational purposes and is not intended to be and should not be considered legal or investment advice. The views, thoughts and opinions expressed herein are those of the author alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.