Worldcoin’s struggle for hearts, minds, and irises
Ready or not, the orbs are pushing the envelope
Happy September! Sorry we haven’t been around as much recently—we’re working on something big and we’re NEARLY ready to talk about it. It’s coming! In the meantime, let’s take a few moments to consider the implications of iris-scanning crypto orbs.
Look into the orb
In at least one important way, the conversation about Worldcoin is misplaced in time. The system’s primary purported use case—to distinguish humans from artificially intelligent entities online without compromising those humans’ privacy—solves a problem most of the world isn’t concerned about yet.
Still, the arguments that the iris-scanning cryptocurrency project is having with governments across the globe over data privacy—several nations have banned or suspended the project and a handful of others are investigating it—are not the stuff of science fiction. They reflect how the conversation about trust and privacy online is poised to change with the rise of novel cryptographic systems.
The latest flashpoint is in Colombia, where regulators have filed a “statement of charges” against the Cayman Islands-based Worldcoin Foundation, which calls itself the “steward” of the Worldcoin protocol, and Tools for Humanity, the San Francisco-based company in charge of developing the technology, for “alleged violations” of the nation’s regulations for personal data collection. Those rules prohibit the processing of personal data—defined as “any operation or set of operations on personal data, such as the collection, storage, use, circulation, or deletion”—without explicit consent from the owner of the information. Companies must be able to explain exactly what the data will be used for and only use it for that purpose.
Worldcoin, best known for its chromed-out iris-scanning “orbs,” launched in Colombia in May. A day later, the country’s Superintendency of Industry and Commerce (SIC) opened an investigation. Worldcoin, the regulator said, had not yet “demonstrated scientifically or technically in other jurisdictions or in Colombia that (its iris-scanning process) does not involve the collection of sensitive personal data.”
The “World ID”
Worldcoin’s orbs capture images of new users’ faces and irises. According to the project’s whitepaper, each orb converts the images into “iris codes.” (Worldcoin describes an iris code as “a numerical representation of the texture of a person’s iris.”) When a new person signs up, Worldcoin uses their iris code to check their “uniqueness”: a “uniqueness-check service” verifies that the code is different from every code it has seen before, then updates its database with the new code and adds a public cryptographic key associated with the new user to a smart contract on the Ethereum blockchain.
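For a rough sense of what a uniqueness check like this can look like, here is a minimal sketch in Python. It assumes iris codes are fixed-length binary vectors compared by fractional Hamming distance, which is standard practice in iris biometrics but not something the whitepaper excerpt above spells out; the threshold, array size, and function names are all hypothetical.

```python
# Illustrative sketch of a uniqueness check over iris codes treated as binary
# vectors. Not Worldcoin's code: the Hamming-distance comparison and the
# threshold value are assumptions for illustration.

import numpy as np

THRESHOLD = 0.36  # hypothetical fraction of differing bits below which two codes "match"

def is_unique(new_code: np.ndarray, known_codes: list[np.ndarray]) -> bool:
    """Return True if new_code is sufficiently different from every stored code."""
    for code in known_codes:
        fractional_hamming = np.count_nonzero(new_code != code) / new_code.size
        if fractional_hamming < THRESHOLD:
            return False  # too similar to an existing enrollee
    return True

# Usage: enroll only if the code is unique.
registry: list[np.ndarray] = []
new_code = np.random.randint(0, 2, size=12800, dtype=np.uint8)  # stand-in iris code
if is_unique(new_code, registry):
    registry.append(new_code)
    # ...at this point a public key for the new user would also be registered on-chain
```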
An iris code contains no personal information—there is no known way to reverse engineer it and get biometric data, according to Worldcoin. Crucially, the system does not use the iris scans or codes to authorize transactions, as Johns Hopkins University cryptographer Matthew Green noted last year in an examination of Worldcoin’s documentation. That would be bad, he said, because it would mean iris scans could be used to steal money.
Worldcoin says it only uses the iris code to check a new user’s “uniqueness.” After that it stores the code in its database. But when someone uses their “World ID,” they don’t use their iris code. They use a zero-knowledge proof, a cryptographic statement that reveals nothing beyond what is necessary for verification—in this case, confirmation of a user’s uniqueness. “This means no third-party will ever know a user’s World ID or wallet public key, and in particular cannot track users across applications,” the white paper states. “It also guarantees that using World ID is not tied to any biometrical data or iris codes.”
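To make the idea of a zero-knowledge proof concrete, here is a toy Schnorr identification protocol in Python: the prover convinces a verifier that it knows a secret exponent without ever revealing it. This only illustrates the general concept; World ID’s proofs rely on a different, far more elaborate construction, and the parameters below are deliberately toy-sized and insecure.

```python
# Toy interactive Schnorr proof: the prover shows it knows x with y = g^x (mod p)
# without revealing x. Real systems use large prime-order groups or elliptic
# curves; these parameters are for illustration only.

import secrets

p = 2**61 - 1   # small Mersenne prime, insecure, illustration only
g = 3           # arbitrary public base

# Prover's secret and the corresponding public value
x = secrets.randbelow(p - 1)   # the secret the prover wants to keep hidden
y = pow(g, x, p)               # public value derived from the secret

# --- one round of the protocol ---
r = secrets.randbelow(p - 1)   # prover: fresh randomness
t = pow(g, r, p)               # prover -> verifier: commitment
c = secrets.randbelow(p - 1)   # verifier -> prover: random challenge
s = (r + c * x) % (p - 1)      # prover -> verifier: response (x stays hidden)

# Verifier accepts without ever learning x:
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```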
The project also has an advanced method for securing the iris codes, using an emerging cryptographic technology called secure multiparty computation. This splits the code into “multiple different secret shares that are stored and encrypted across multiple secure databases.” The code can only be reconstituted if all of the shares are combined. “Even a complete security breach of some participants would not leak the secret, as long as at least one of the participants is secure,” Worldcoin explained in a blog post.
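The splitting principle is easy to illustrate with the simplest possible scheme, n-of-n secret sharing via XOR: any strict subset of shares is indistinguishable from random noise, and only the full set recovers the secret. Worldcoin’s actual multiparty-computation setup is far more sophisticated; the sketch below, with a made-up share count and a stand-in “iris code,” only shows the basic idea.

```python
# Minimal n-of-n secret sharing with XOR: every share is required to
# reconstruct the secret, so a breach of some (but not all) databases
# reveals nothing. An illustration of the splitting principle only.

import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret: bytes, n: int) -> list[bytes]:
    """Split `secret` into n shares; all n are needed to reconstruct it."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    final_share = reduce(xor_bytes, shares, secret)  # makes the XOR of all shares equal the secret
    return shares + [final_share]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the secret."""
    return reduce(xor_bytes, shares)

iris_code = secrets.token_bytes(32)    # stand-in for an iris code
shares = split(iris_code, 3)           # e.g. stored across three separate databases
assert combine(shares) == iris_code    # only the complete set recovers it
```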
Meanwhile, the original iris and face images are “packaged, encrypted, and ‘signed’ by the Orb to ensure authenticity and security, then sent to temporary backend storage for transit before the Orb deletes them,” explains an FAQ on Worldcoin’s website. “Importantly, the backend cannot decrypt your data package.” Worldcoin claims that no “raw biometric data” ever leaves the device without the owner’s consent. But it does give users the option to let Worldcoin use the iris scans to train its algorithms.
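The general pattern the FAQ describes, signing on the device for authenticity and encrypting to a key the backend does not hold, can be sketched with an off-the-shelf library such as PyNaCl. This is not Worldcoin’s pipeline; the keys, payload, and flow below are hypothetical stand-ins.

```python
# Sketch of "sign for authenticity, encrypt so the relay can't read it,"
# using PyNaCl. Hypothetical keys and payload; not Worldcoin's actual code.

from nacl.public import PrivateKey, SealedBox
from nacl.signing import SigningKey

orb_signing_key = SigningKey.generate()   # the capture device's signing key
user_key = PrivateKey.generate()          # only the user holds the decryption key

image_bytes = b"...raw iris image..."     # placeholder payload

signed = orb_signing_key.sign(image_bytes)                  # device attests to authenticity
sealed = SealedBox(user_key.public_key).encrypt(signed)     # only the user can open this

# The backend stores `sealed` temporarily but cannot decrypt it.
# On the user's device:
opened = SealedBox(user_key).decrypt(sealed)
orb_signing_key.verify_key.verify(opened)                   # raises if tampered with
assert opened[64:] == image_bytes                           # Ed25519 signature is the 64-byte prefix
```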
“This sparks fear”
According to the Wall Street Journal, some regulators have alleged that despite what it says about biometric data never leaving the orb, Worldcoin has in some cases trained orb operators to encourage users to share their raw iris images. Some critics have accused the project of exploiting people who may not understand the technical implications of this decision. “It’s taking advantage of people’s lack of sophistication to train very advanced and novel technology,” Calli Schroeder, senior counsel at the Washington, DC-based Electronic Privacy Information Center, told the WSJ.
It hasn’t helped that over the past few years, Worldcoin has developed a reputation for using deceptive marketing and recruiting processes. It has also faced criticism for dangling small cryptocurrency enticements to enroll people in nations like Argentina, where many people are facing economic challenges.
In March, Spanish regulators banned Worldcoin for three months after receiving complaints that it was illegally collecting data from minors and not providing enough information to users about its data practices. In May, Hong Kong’s Privacy Commissioner ordered Worldcoin to stop all operations there after finding that it was collecting “unnecessary and excessive” biometric data and that users were not properly informed of the purpose of the data collection and “whether it was obligatory or voluntary for them to supply their personal data.”
Damien Kieran, Worldcoin’s chief privacy officer, told the WSJ that Worldcoin has temporarily stopped giving users the option to share their iris images while it develops a new process.
Worldcoin is aware of its image problem. The investigation in Colombia is proceeding despite Tools for Humanity hiring Orza, a lobbying firm, in hopes of establishing a good relationship with authorities well before the May launch date. Gonzalo Araujo, Orza’s co-founder, told Rest of World that his team wanted to “demystify the international cases or the supposed prohibitions of operations in some markets, because these things make a lot of noise.”
The company met with the regulators before the launch to “explain how this technology works and what we do in privacy and security,” Martín Mazza, Tools for Humanity’s Latin America manager, told Rest of World.
Those meetings don’t seem to have had the desired effect.
Perhaps that’s in part because the technologies Worldcoin is working with are so novel and complex. Despite the project’s assurance that it can be trusted to implement sophisticated “privacy-enhancing” technologies, regulators in Europe still have questions about how Worldcoin is securing the biometric data it stores. Michael Will, a German regulator leading an EU inquiry into Worldcoin, told the WSJ that his team is focused on “ensuring iris codes and images are secure, given that biometric data can’t be altered and any breach could lead to identity fraud.”
“Zero-knowledge proofs are famously difficult to get right,” Green wrote in his Worldcoin writeup last year. Small vulnerabilities “can create a huge privacy leak,” he added. But while Green expressed skepticism about Worldcoin’s long-term intentions, he was “pleasantly surprised by the amount of thought that Worldcoin has put into keep(ing) transaction data unlinked from its ID database.”
Nonetheless, it could also be that regulators are skeptical or suspicious at least in part because they don’t understand what Worldcoin is for. Maybe the idea that distinguishing humans from AI is an urgent problem strikes them as science fiction. Without a clear idea of why people would use it, can policymakers be blamed for assuming this is another Silicon Valley company trying to exploit its users?
Araujo, Worldcoin’s lobbyist in Colombia, said this to Rest of World: “When you don’t have a specific use case, when you can’t tell authorities ‘this is our business model, and this is what we do,’ this sparks fear.” —Mike Orcutt
ODDS/ENDS
Pavel Durov, founder of the messaging app Telegram, has been charged in France with various crimes related to his alleged failure to prevent illicit activity on the platform. According to the New York Times, Durov faces charges of “complicity in managing an online platform to enable illegal transactions by an organized group” in addition to complicity in the distribution of child sexual abuse material, drug trafficking and fraud, and refusing to cooperate with law enforcement. We’ll be watching this one closely.
Many in Brazil now feel “disconnected from the world” after a judge there suspended X, formerly known as Twitter, for refusing to name a legal representative in the country. Bluesky has seen 200,000 new users from Brazil since the ruling, according to the AP.
Nearly half of all corporate contributions to US federal elections in 2024 have come from crypto corporations. According to a new report from Public Citizen, $119 million of the $248 million in corporate cash contributed thus far has come from the crypto industry, with most of it ($114 million) going into the coffers of Fairshake, a pro-crypto Super PAC. Americans for Prosperity Action, primarily backed by Koch Industries, is a distant second with $26 million received. More than half of Fairshake’s funds “came directly from corporations that stand to profit from the PAC’s efforts, mostly Coinbase and Ripple,” the report reads. “The rest of the PAC’s funds mostly come from billionaire crypto executives and venture capitalists, including $44 million from the founders of the venture capital firm Andreessen Horowitz, $5 million from the Winklevoss twins, and $1 million from Coinbase CEO Brian Armstrong.”
Meanwhile, Fairshake is ruffling feathers on both sides of the aisle. The Super PAC has targeted “anti-crypto” candidates, regardless of party, and that has made some crypto supporters unhappy. Republicans are upset that Fairshake backed Democratic Senate candidates in two battleground states, Arizona and Michigan. And tech billionaire and Democratic donor Ron Conway, who donated $500,000 to Fairshake in December, is furious that the Super PAC is backing the opponent of Sherrod Brown, a powerful Democratic Senator from Ohio, another battleground state.
Conway’s outburst came on the heels of a pledge by Senate Majority Leader Chuck Schumer that he would try to get crypto legislation through the Senate before the end of this year. The comments came during a “Crypto4Harris” town hall last month. “Sadly there are a lot of members in Congress nowadays who built their political brands around creating spectacle and sensationalism instead of putting in the hard work of legislation,” Schumer said. “Crypto is here to stay no matter what so Congress must get it right.” He did not say which specific legislation he was talking about.
Contrary to the Trump-written GOP platform, the recently unveiled Democratic platform contains no mention of crypto.
A Trump campaign disclosure revealed that he owns up to $5 million in crypto and has earned more than $7 million selling NFTs.
NFT marketplace OpenSea has received a so-called Wells notice from the SEC “threatening to sue us because they believe NFTs on our platform are securities,” CEO Devin Finzer said on Twitter/X. He said the move threatened to “stifle innovation” and could harm “hundreds of thousands of online artists and creatives.”
The Optimism Foundation has temporarily disabled the permissionless “fault proof” system it deployed in June. The move came after audits revealed security vulnerabilities that require fixes. According to OP Labs, the software development company building Optimism’s technology, none of the vulnerabilities were exploited and user assets “are not and were never at risk.” Implementing the permissionless system had let Optimism claim it achieved “Stage 1” decentralization according to the informal rubric formulated by Ethereum co-creator Vitalik Buterin. Back to Stage 0 for now.
Fabric Cryptography, which is developing chips customized for cryptography, raised $33 million in venture capital funding. The company is developing what it calls a “verifiable processing unit,” which will be designed to handle cryptography-intensive applications.
California’s legislature has passed a controversial new bill that would introduce restrictions on AI development in the interest of public safety. Although debate over the bill has featured the typical arguments over whether new rules might “stifle innovation,” the politics of the bill are unusual. Elon Musk has signaled his support. Nancy Pelosi has called the bill “well-intentioned but ill-informed.” Governor Gavin Newsom has until September 30 to decide whether to sign it into law, but he has not yet revealed his position. According to The New York Times, if Newsom does sign it, “the state will become the standard bearer” for regulating AI.
Condé Nast and OpenAI have agreed on a deal that will let OpenAI use content from Condé Nast’s publications, including Wired, the New Yorker, Vogue, Vanity Fair, and Bon Appétit.
“Telegram and Signal are very different applications with very different use cases.” That’s Signal president Meredith Whittaker to Wired’s Andy Greenberg, part of a long and wide-ranging Q&A in the magazine to mark the encrypted messaging app’s 10th anniversary. “Telegram is a social media app that allows an individual to communicate with millions at once and doesn’t provide meaningful privacy or end-to-end encryption,” she said. “Signal is solely a private and secure communications app that has no social media features.”
Follow us on Twitter or get corporate with us on LinkedIn—if you want.