The privacy narrative is noisy. We've channeled the signal.
The 2025 DC Privacy Summit unearthed the pieces of a complicated puzzle.
Hello again! In case you missed it, our 2nd annual DC Privacy Summit was earlier this month. At the risk of being immodest, it was awesome—we tapped into a deep, urgent conversation around cryptographic privacy thanks to an amazing lineup of speakers. Our attendees were super engaged, with great questions throughout the day.
Below, you’ll find a roundup that’ll catch up anyone who wasn’t able to attend. And to the large chunk of folks who were there and signed up for the newsletter, welcome! Consider this your CliffsNotes from the day.
Let’s get more precise about privacy
The story of cryptocurrency and privacy has mostly been the same since day one: as long as the public ledger is transparent, the money is traceable. As long as the money is traceable, the state (and possibly your adversaries) will trace it. For cryptocurrency to truly be a money system independent of state control, it must be both uncensorable and private. This is nothing new. Hence Zcash, Monero, Tornado Cash, Samourai Wallet, and more recent projects like Railgun and 0xbow—all of which use cryptographic techniques to achieve some level of privacy onchain.
Recently, however, the story has begun to change. The rise of shockingly powerful data processors like large language models, the overt aggregation of government databases to juice surveillance powers, and massive thefts of personal data by nation-state hackers are just a few of the reasons for that. More people, even in historically safe and stable places like the US, are looking to better protect their data.
Combine that with an administration in Washington positioned as “pro-crypto” and it helps explain why so many influential folks in crypto social media spaces have been posting so much about privacy lately. Not only is it timely, but the subject is no longer widely seen as taboo. (Coincidentally or not, the price of Zcash, the first cryptocurrency to put zero-knowledge proofs into practice, has also been pumping.)
But social media, especially crypto social media, is mostly noise.
To find the signal, you’ll have to dig beneath the shitpost-covered surface. Talk to the folks giving themselves headaches from thinking so hard about how to balance onchain privacy with the risks posed by bad actors. Ask the regulators who see the promise of emerging cryptographic privacy tools, but who also see first-hand how the world’s most sophisticated hackers use them to evade law enforcement. Push the conversation beyond the usual absolutism into a grey area full of technical possibilities and ideas, but devoid of concrete plans for how to move forward.
That’s what we did on October 16.
The 2nd Annual DC Privacy Summit highlighted the many facets of an urgent, pragmatic conversation about cryptographic privacy in 2025. Interested in participating in this conversation? We recommend you close Twitter for a while and immerse yourself in the Privacy Summit sessions we describe below, in the order in which they occurred on the day of the event.
Read the summaries. Watch the videos. Watch them again. And then let’s keep talking about privacy for real. The narrative on social media has nothing on this stuff.
You are probably underestimating modern cryptography. Arnaud Schenk, executive director of the Aztec Foundation, noted in his opening remarks that until recently, cryptographic tools were mostly binary—fully private or fully transparent. That’s no longer the case, he said. Tools incubated in the blockchain space over the past decade can achieve much greater flexibility. These tools will only get more powerful and easier to use, and they are bound to shake things up considerably, he said:
“By definition, they break a bunch of assumptions that I think a lot of regulators, a lot of companies, have about how the world works or ought to work. They give people new powers, and they create new risks. They will make some regs completely counterproductive, and they will make new regs potentially needed.”
A privacy forest fire is coming. Johns Hopkins University cryptographer Matthew Green has a dire warning: that paranoid thought in the back of your mind that somewhere someone knows everything about you, right down to your individual Google searches, will not just be a paranoid thought for much longer. His preferred analogy is a forest fire:
The dry timber is the vast amount of data that governments and companies have already collected and are now storing.
The accelerant is the current push by governments to weaken encryption and bind human identities to online interactions using mobile driver’s licenses and other digital credentials. “In the past, we’ve survived all of this data collection because we had limited human capacity to process data,” Green said.
That capacity is no longer limited, he argued, thanks to the fire-starter: emerging capabilities to run machine learning inference at massive scale.
It’s not all bad news; there are still things we can do to keep the whole forest from burning to the ground. But it’s late in the game. (Watch Matthew Green’s talk)
Technology is not a privacy cure-all. “When we talk about privacy today, the conversation inevitably quickly turns to technology,” Neha Narula, director of the MIT Media Lab’s Digital Currency Initiative, said in her keynote. But we need to be precise about what these technologies can and can’t do. Take zero-knowledge proofs:
“They let you show that a computation was carried out correctly, without revealing the underlying data, and that’s remarkable. But they don’t tell you if the data itself is accurate or complete, and they don’t preclude the need for an authority to source that data, like a government indicating citizenship. The mere use of verifiable credentials can’t prevent an authoritarian government from denying some of its citizens that credential. And it can’t stop a platform from using that credential to kick certain types of people off.”
Narula is also concerned that if used unwisely, zero-knowledge proofs and other tools could replace autonomy with, as she put it, “automated control.” Imagine having to prove you are credit-worthy before you can call an Uber, or prove your health status before being allowed into a building. “A technology that was originally designed to help with privacy turns into continuous permissioning.” Technology itself is not enough; technologists must figure out how to work within policy and the law, Narula said. (Watch Neha Narula’s talk)
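The core property Narula describes—proving something without revealing the underlying data—can be pictured with a toy Schnorr-style proof of knowledge. This is an illustrative sketch with deliberately tiny, insecure parameters, not any production protocol:

```python
import hashlib
import secrets

# Toy group parameters (far too small for real security): g generates a
# subgroup of prime order q modulo p.
p, q, g = 179, 89, 4

def challenge(y, t):
    """Fiat-Shamir challenge derived by hashing the public values."""
    data = f"{g}:{y}:{t}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x):
    """Prove knowledge of secret x with y = g^x mod p, without revealing x."""
    y = pow(g, x, p)
    r = secrets.randbelow(q)
    t = pow(g, r, p)          # commitment
    c = challenge(y, t)       # challenge
    s = (r + c * x) % q       # response
    return y, (t, s)

def verify(y, proof):
    """Check g^s == t * y^c mod p; learns nothing about x itself."""
    t, s = proof
    c = challenge(y, t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x = secrets.randbelow(q)      # prover's secret, never transmitted
y, proof = prove(x)
assert verify(y, proof)
```

The verifier is convinced the prover knows `x`, yet only ever sees `y` and the proof—which is exactly the property that, as Narula cautions, says nothing about whether the underlying data was accurate in the first place.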
Human dignity still matters. After her talk, Narula was joined on stage by SEC Commissioner Hester Peirce. They discussed a speech that Peirce gave in August that highlighted the promise of zero-knowledge proofs for achieving greater financial privacy. It’s significant that a regulator is talking about this stuff in such detail. But is it really possible to change entrenched mindsets in DC around anti-money-laundering and financial data collection?
If so, it will be by appealing to our shared dignity, Peirce said.
“We don’t want to become like the bad guys in the process of going after the bad guys, right? So we want to make sure that we preserve what this country is about, which is the dignity of every person. That’s the key. And what does that mean? Dignity—part of what is dignity—is that you get to choose who you want to spend your time with and who you want to share intimate details of your life with.”
The way the government relies on the private sector to collect and access Americans’ financial data “doesn’t accord with that fundamental principle that binds us together,” she said. (Watch the fireside chat featuring Neha Narula and Commissioner Hester Peirce)
DeFi is critical infrastructure. Whether or not DeFi is important to you, it’s undeniably important to the North Korean regime, which has exploited crypto networks for billions of dollars via its state-sponsored hacking group known as Lazarus. It was Lazarus’s use of Tornado Cash that sparked the crackdown leading to the criminal convictions of developers Roman Storm and Alexey Pertsev. The challenge of countering sophisticated state actors in cyberspace is not new in finance, attorney Michael Mosier noted during a panel that also featured Casey Golden, the CEO of zeroShadow, and Samczsun of the Security Alliance (SEAL). Mosier, a former director at FinCEN and at the National Security Council, has seen firsthand how the traditional financial system uses public-private partnerships to respond to incidents and share threat intelligence. However, he noted:
“The networks that we are talking about now are different.”
The three panelists described a huge amount of work that must be done on both the technical and legal sides to build the right “pipes” for sharing crypto threat intelligence effectively. (Watch the panel featuring Michael Mosier, Casey Golden, and Samczsun)
The Tornado Cash and Samourai Wallet prosecutions are a chilling development. Michael Lewellen wants to build a smart contract-based crowdfunding protocol that would use zero-knowledge proofs to let donors remain anonymous. But he’s worried that he might get prosecuted for unlicensed money transmission—the same crime that Tornado Cash developer Roman Storm was convicted of, and Samourai Wallet developers Keonne Rodriguez and William Lonergan Hill pled guilty to. As we’ve discussed at length in this newsletter, these prosecutions by the Department of Justice have contradicted something Treasury’s FinCEN said in 2019: if a software developer never takes control of user funds, they are not a money transmitter. Lewellen sees the Tornado Cash and Samourai prosecutions as contrary to the rule of law, so instead of building his protocol, he is suing the DOJ.
“With my lawsuit, I would like to have a definitive answer from the courts saying ‘No, they cannot do this anymore.’”
(Watch the fireside chat with Michael Lewellen)
A new approach to anti-money-laundering is possible. There’s a good argument that traditional anti-money-laundering (AML) and know-your-customer (KYC) processes are outdated. Either way, these established approaches don’t fit decentralized systems. Coin Center’s Peter Van Valkenburgh called on the audience to imagine:
“A better world, where new technologies like verifiable credentials, zero knowledge proofs, and open blockchains are used in alternate modes of AML where users own their own identity credentials, where they don’t need to repeatedly provide them unencrypted to every institution where they open an account, where they can selectively prove discrete facts about themselves, and where these proofs and credentials can be rapidly composable into effective but minimally invasive risk scores to address evolving money laundering threats without immediate de-banking for innocent folks and hopefully with fewer naive false positives from overtly simplistic identity practices.”
Van Valkenburgh and Coin Center have launched the John Hancock Project to help usher in this new world. (Watch Peter Van Valkenburgh’s talk)
Onchain privacy is like nuclear physics. Credit for that analogy goes to Wei Dai, a cryptographer at 1kx, who shared it during his keynote.
“It is a dual-use technology that can do great good for the world, but also can be very dangerous. For those that design and help shape these protocols, we need to be conscious of the potential risks with these protocols.”
Technologists face what Dai called a “trilemma”: privacy protocols cannot simultaneously achieve perfect privacy, threat-resistance, and “maximum usefulness.” It’s possible to impose technical measures that reduce risk while sacrificing usefulness. Another approach, he explained, is to sacrifice perfect privacy for threat-resistance, by making it possible for certain entities to view certain parts of the transaction record under certain circumstances. This gives users the freedom to dissociate from other, malicious users, he argued. (Watch Wei Dai’s talk)
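One simple way to picture Dai’s idea of letting certain entities view certain parts of a transaction is per-field hash commitments with selective openings. This is a toy sketch with made-up transaction fields, not how any production privacy protocol actually works:

```python
import hashlib
import secrets

def commit(value):
    """Salted hash commitment: hiding until its opening is shared."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + value.encode()).hexdigest()
    return digest, (salt, value)  # public commitment, private opening

def check_opening(commitment, opening):
    """Anyone holding the opening can verify it matches the commitment."""
    salt, value = opening
    return hashlib.sha256(salt + value.encode()).hexdigest() == commitment

# Commit to each transaction field separately; only commitments go onchain.
tx = {"sender": "alice", "recipient": "bob", "amount": "250"}
public, openings = {}, {}
for field, value in tx.items():
    public[field], openings[field] = commit(value)

# Under defined circumstances, the user discloses only the amount.
disclosed = openings["amount"]
assert check_opening(public["amount"], disclosed)
```

The auditor can verify the disclosed amount against the public record, while the sender and recipient commitments stay opaque—trading a slice of perfect privacy for threat-resistance, as Dai described.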
Ready or not, zero-knowledge crypto is going mainstream. Google has made it possible to use its wallet application to prove, using a zero-knowledge proof, that you are over the age of 18. It doesn’t get any more mainstream than that. And it’s just the start. As Northeastern University cryptographer Abhi Shelat and a16z research partner Justin Thaler explained during their panel, the technology is already capable of making age verification more private—and will soon be able to run all kinds of identity-related checks while maintaining user privacy. But Thaler has a warning:
“These protocols are orders of magnitude more complicated than today’s digital signature schemes or encryption schemes. And so they are full of bugs; we’ve got to be careful until we develop some confidence that there aren’t bugs there.”
(Watch the panel featuring Abhi Shelat and Justin Thaler)
The EU digital wallet initiative is a cautionary tale. How can we avoid dystopian digital ID systems? A panel featuring Amal Ibraymi from Aztec, Aisling Connolly from TACEO, and independent applied cryptographer Ying Tong grappled with that question. A particularly pressing real-world problem involves the EU’s digital wallet project, which offers a cautionary tale for other governments, Ying Tong said.
“The EU Commission themselves had set forth really strict requirements around unlinkability, stating that issuers and verifiers should not be able to link individual presentations of a credential. Yet this did not prevent them from picking a solution that failed to comply with their own requirements. If they had done a careful threat modeling, and if they had done a careful examination and taxonomy of use cases, and if they had involved technical experts earlier on, this could have been avoided.”
(Watch the panel featuring Ying Tong, Aisling Connolly, and Amal Ibraymi)
AML is more than static checks. The GENIUS Act has provided an opportunity to reimagine AML for digital assets (whether or not anything comes of it). But verifiable, anonymous credentials that use zero-knowledge cryptography represent only one piece of the puzzle. Many of the ideas out there revolve around “static things” like “prove, at this time, that I’m not a North Korean. Prove I have a US passport, prove whatever,” noted Ian Miers, who joined fellow panelists Laz Pieper of the DeFi Education Fund and Ross Schulman of SpruceID to talk about what the future of AML could look like. Miers continued:
“None of the history of computer attack and defense or anti-money-laundering is a static system. You have to be able to react and adapt because any given tactic you pick, they’re going to adapt their tactics, techniques and procedures to get around it. That’s reality. It’s a cat-and-mouse game. That’s the game that FinCEN plays. That’s how they go to banks and say ‘Look, your AML flags aren’t up to date, you need to update them.’”
Miers argued that the goal should be a system in which users own their data and use it to calculate a “dynamic risk score” that institutions can use for AML and which can be constantly updated. (Watch the panel featuring Ian Miers, Laz Pieper, and Ross Schulman)
Be more precise about what you mean by privacy. Are you talking about pseudonymity, confidentiality, anonymity, or total privacy? What exactly are you hiding, and from whom? Why? For payroll, where it’s OK for employees to see each other get paid, keeping the amounts confidential might be enough. That’s the “vanilla” flavor of privacy, said Inco founder Remi Gai, who joined Predicate CEO Nikhil Raghuveera for the final panel of the day. If you don’t want everyone to be able to see who you are paying, anonymity is in order. “Total privacy”—hiding the sender, amount, and recipient—might be needed if you want to hide the movements of certain “sensitive assets,” Gai said. The flavor of privacy your system features will depend on the cryptographic techniques you use. Besides zero-knowledge cryptography, which can be used to prove things about the user, other methods like multiparty computation (MPC), fully homomorphic encryption (FHE), and trusted execution environments (TEEs) can be used to compute on encrypted data, for example, to produce risk scores. Compliance is extremely complicated—but blockchains are up to the challenge, Raghuveera said.
“The existing financial system is not programmable. That is actually something that blockchains have a massive edge on.”
(Watch the panel featuring Remi Gai and Nikhil Raghuveera)
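The idea of computing a risk score over data that no single party ever sees in the clear can be pictured with additive secret sharing, the simplest MPC building block. A toy sketch with hypothetical inputs, not a production MPC deployment:

```python
import secrets

MOD = 2**61 - 1  # all shares live in the integers modulo MOD

def share(value, n=3):
    """Split value into n additive shares; any n-1 of them reveal nothing."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

# Three institutions each hold one private risk signal (hypothetical values).
signals = [12, 7, 30]
all_shares = [share(v) for v in signals]

# Each compute server sums the shares it receives; the raw signals never
# leave their owners.
server_sums = [sum(column) % MOD for column in zip(*all_shares)]

# Recombining the per-server sums yields the aggregate score and nothing else.
total = sum(server_sums) % MOD
assert total == sum(signals)
```

Only the final aggregate is revealed, which is the shape of the “minimally invasive risk score” both Miers and Van Valkenburgh gestured at—FHE and TEEs offer different mechanisms for the same end.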



