Worldcoin and the Inherent Insecurity of Biometric-based Identity Systems
v1: April 5, 2023
Author: Jesse Charlie
Background Information (Skip to main article if you already know this)
- (Background info: Public/Private Key Encryption)
When we encrypt messages, we create a key pair: one public and one private. The private key is generated locally, and proving our identity relies on the public key being mathematically derived from the private key, which remains entirely under our control. We can prove the private key is ours because we generated it on our own device and it never leaves that device. Because key generation draws on a large pool of random entropy, it is computationally infeasible to stumble upon our key by brute force.
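As a concrete sketch of local key generation (using SHA-256 as a stand-in for the one-way elliptic-curve math a real scheme such as Ed25519 would use; the function name is illustrative):

```python
import hashlib
import secrets

def generate_keypair() -> tuple[bytes, bytes]:
    # Conceptual sketch, not real public-key crypto: the private key is
    # 256 bits of locally generated randomness that never leaves the device.
    private_key = secrets.token_bytes(32)
    # Stand-in one-way derivation; a real scheme would use elliptic-curve math.
    public_key = hashlib.sha256(private_key).digest()
    return private_key, public_key

private_key, public_key = generate_keypair()

# Brute-forcing the private key means searching a 2**256 space:
print(f"search space: 2**256 ~ {2**256:.2e} candidates")
```

The point of the sketch is the source of security: the secret is born from local randomness, so no third party ever needs to see or attest to it.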
- (Background info: Worldcoin's goal)
Problem: The need for proof of personhood on the internet.
What: Proof of Personhood is a cryptographically provable method of proving who you are (public/private key pairs), proving that you are who you say you are (preventing impersonation with false key pairs), and confirming your humanity (distinguishing between humans and robots with unique key pairs).
Why: To differentiate between AI and humans on the internet, distribute Universal Basic Income (UBI) effectively, enable natural rate limiting against bots (replacing CAPTCHA as AI becomes smarter), etc.
Solution: Many methods can establish proof of personhood. However, Worldcoin argues that biometrics are necessary to create a complete proof of personhood system and prevent fraud.
- Worldcoin's Architecture is based solely on Iris Biometrics:
- An orb (high-resolution camera + DRM hardware) scans your iris.
- The orb creates a hash of the iris (and other eyeball context scanned) and checks if it already exists in the database.
- The orb grants the user a World ID they can use.
- In the future, people can reverify when algorithms improve and restore their identity if lost.
- The Inherent Insecurity
- (We will assume that the scanning algorithm is perfect and that scans of different irises never produce colliding hashes.)
- The database of iris hashes can be fraudulent. We must trust a third party's DRM hardware to attest that each scan came from a real human, even if scanners were built into Face ID-equipped iPhones. Technically, Worldcoin could add any number of false hashes to the database, rendering the goal of a one-to-one correspondence between humans and IDs meaningless. The orb uses liveness detection, but this still depends on the orb and Worldcoin's attestations. (Imagine a government printing many fake IDs for nonexistent people; we trust IDs only because they come from a trusted third party (the government) with anti-counterfeit measures.)
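The trust problem can be made concrete: a hash of operator-invented random bytes is structurally indistinguishable from a hash of a genuine iris scan, so nothing in the database itself proves a human was ever scanned. A minimal sketch (all names illustrative):

```python
import hashlib
import secrets

# A hash of a genuine iris scan and a hash of bytes invented by the
# database operator are both just 32-byte SHA-256 digests.
real_iris_scan = secrets.token_bytes(64)   # stand-in for a real scan
fabricated_scan = secrets.token_bytes(64)  # invented by the operator

real_hash = hashlib.sha256(real_iris_scan).digest()
fake_hash = hashlib.sha256(fabricated_scan).digest()

# Both entries look identical in structure; only the orb's hardware
# attestation (a trusted third party) claims one came from a human.
assert len(real_hash) == len(fake_hash) == 32
```

This is why neither a blockchain nor any property of the hashes themselves can rule out fabricated entries: the fraud happens before the data ever reaches the ledger.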
- Deriving a cryptographic key pair using biometrics is inherently insecure. Connecting a cryptographic key to biometrics requires a trusted third party. There are two ways this can be done:
- The orb must attest from its hardware signature that it has granted you the right to claim your identity.
- The orb uses your biometric information to create a key pair (requiring trust that the orb isn't copying your key pair before giving it to you).
This differs fundamentally from generating a cryptographic key pair locally on your device. Classical cryptography feeds freshly generated random bits into the key-generation algorithm. A biometric scheme, at best, feeds your biometric data into that algorithm, and at worst relies on a trusted third party's hardware attestation. This means that, at best, your identity can be stolen by anyone who scans your eyeballs, and at worst, Worldcoin themselves can reset your identity.
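The contrast can be sketched as follows: a key derived deterministically from biometric data can be re-derived by anyone who obtains that data, while a key drawn from fresh randomness cannot. (SHA-256 here stands in for a real key-derivation function; the names are illustrative.)

```python
import hashlib
import secrets

def key_from_biometric(iris_scan: bytes) -> bytes:
    # Deterministic: the same iris always yields the same key.
    return hashlib.sha256(b"kdf-context" + iris_scan).digest()

def key_from_entropy() -> bytes:
    # Random: unreproducible even by the same person.
    return secrets.token_bytes(32)

iris = secrets.token_bytes(64)  # stand-in for an iris scan

# An attacker who scans the same iris re-derives the identical key...
assert key_from_biometric(iris) == key_from_biometric(iris)

# ...whereas entropy-based keys cannot be regenerated by anyone.
assert key_from_entropy() != key_from_entropy()
```

The biometric path turns your body into the secret, and a secret you cannot change or keep private is a poor foundation for a key.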
Worldcoin fails to create a true proof of personhood protocol because it fails to:
- Prevent fraudulent iris hashes in the database, as we must trust a third party to add new hashes. (Neither blockchain nor reversible functions can solve this problem).
- Securely connect biometric information to identity, either relying on a trusted third party to attest your identity (like a government-issued ID) or using biometric data to generate a key pair (which can be stolen by scanning your eyes or copying the key pair before it leaves the orb).
In classical encryption, someone can impersonate you only if they have your private key, which requires breaking the encryption algorithm, brute-forcing your key, or compromising the hardware that stores it.
With Worldcoin, someone can impersonate you by scanning your iris (forcibly or surreptitiously) at best, and at worst, Worldcoin themselves can impersonate you. There's no way to connect biometrics to a key pair unless the key pair is generated using biometrics or if a trusted third party attests that your biometrics are tied to your key pair.
Moreover, Worldcoin can create unlimited fake iris hashes, rendering the one-to-one human-to-ID protocol ineffective.
This is concerning because:
- Powerful AI may be gatekept behind World ID, potentially requiring an insecure identity system to access resources that could exponentially benefit users. (Sam Altman is part of Worldcoin and OpenAI, which currently has the most powerful publicly available general AI.)
- Governments could use this system to access facilities and services (mostly outside the US and possibly the EU, since the US would likely create its own system if Worldcoin succeeds).
- Web services and banks could use World ID.
The primary purpose of Worldcoin (biometrics proving that one person has only one ID) fails, necessitating alternative methods without biometrics for creating Proof of Personhood.