In April 2024, PlasBit approached me with a writing offer. The work seemed like standard cryptocurrency writing fare: write a couple of sentences on crypto prices and sprinkle in some info on crypto tech. It was only when I saw the PlasBit manifesto that I became intrigued. The manifesto mentioned privacy and human rights while emphasizing cypherpunk values that seemed like something I had been looking for my entire life. I was vaguely familiar with the cypherpunk movement, but even after the first couple of articles, I was still wondering: what are the values of cypherpunk? In short, the values of cypherpunk are:
- Private digital transactions.
- Private digital communication.
- Avoiding government surveillance.
Those values rest on convenient, strong encryption methods that prevent surveillance of user data and tampering with it.
That answer knocked my socks off because it helped me address a pain point I had always felt but never dared to put my finger on. I was always painfully aware of various guardian figures who claimed to know better than me and should, therefore, decide what my life should look like. Their only requirement was that I relinquish my privacy and reveal to them my innermost desires, fears, and dreams. For me, that was a line I could never cross and a price I would never pay. It meant that, no matter what, I would struggle to achieve the kind of life I want to live, one where I have personal autonomy and the freedom to speak my mind. It turned out that writing for PlasBit brought me one step closer to that future, because the company firmly believes in cypherpunk values.
The Brilliance of Cypherpunks
To answer for myself what the values of cypherpunk are, I had to dig deep into internet archives, poring over forgotten manifestos and whitepapers. It turned out that there was a group of dreamers in the 1980s and early 1990s who envisioned cyberspace becoming so rich and vast that we couldn’t live without it. To be offline would mean living in misery and poverty, detached from the world and the zeitgeist. But powerful actors could take control of cyberspace and turn it into a digital prison for our minds, one we would sign in to voluntarily, relinquishing all our privacy and autonomy for the promise of convenience and security. The only way to avoid that future was to ensure that people were protected from the abuse that unfettered data gathering and analysis would bring.
The cypherpunk values trickled into the public consciousness through a series of papers, such as “A Cypherpunk’s Manifesto.” Published in 1993 by Eric Hughes, the Manifesto laid out the core motivations of cypherpunks, who had seen the danger of a cyberspace in which information propagates with near-infinite speed and remains there forever. Therefore, the Manifesto advocated revealing as little as possible in each transaction and communication because “information longs to be free.” To ensure privacy, Eric recommended strong cryptography: encryption, so that only the intended recipient can read a particular message, and cryptographic signatures, so that a sender can prove who they are when they choose to reveal themselves.
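For readers who would rather see the idea than just read about it, here is a minimal sketch in Python of a message only the intended recipient can read. I’m using the PyNaCl library purely as an illustration; the Manifesto names no specific tool, and the key names and message below are made up.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only the public halves are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"Meet at the usual place at noon.")

# Only Bob, holding his private key, can decrypt the message; it is also
# authenticated, so Bob knows it really came from Alice.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Meet at the usual place at noon."
```

The point is not the particular library but the property: without Bob’s private key, the ciphertext is just noise, no matter who intercepts it.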
One interesting idea in the Manifesto was that cypherpunks intended to bring about change by writing and spreading software that others could improve on. In Eric’s view, software that was spread widely enough could not be destroyed, and a system based on it could not be shut down. If we think of software as nothing more than information, then we could say that software also longs to be free and, once liberated, becomes an eternal part of a common digital resource.
Regulations on Cryptography
Another important motif in the Manifesto touches on the regulation of cryptography. Encryption removes information from the public realm; it is a private act that should concern only the owner of the information, yet governments and state-level agents constantly try to curb encryption under the excuse of crime prevention. That would be equivalent to a government forcing people to walk around naked in public to show they have nothing to hide, or forcing them to install surveillance cameras in their homes to help prevent crime. Answering what the values of cypherpunk are revealed numerous such attempts by governments and state-level agents to curb encryption, with the most recent coming courtesy of the European Union in June 2024.
Dubbed the “Chat Control” law and first proposed in 2022, this piece of Orwellian regulation includes a wonderful example of doublespeak as well, aiming to introduce something called “upload moderation.” In short, the law would curb encryption by scanning content sent through messaging apps before it is encrypted; only what the scanning system deems acceptable can be encrypted and passed on. In that way, encryption still exists but is made meaningless, and there is total surveillance of all content, any part of which can be blocked without explanation or human intervention. The scanning system would apparently use artificial intelligence and be deployed at the telecom level under the excuse of combating exploitative content featuring children.
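To make the mechanism concrete, here is a purely illustrative Python sketch of what “upload moderation” amounts to. Every function name and rule below is hypothetical and invented for this article rather than taken from the proposal, but the flow, scanning the plaintext first and encrypting only what the scanner approves, is the essence of it.

```python
# Purely illustrative; every name here is hypothetical, not from the EU proposal.
def classify_content(plaintext: bytes) -> bool:
    """Stand-in for the AI scanner the proposal envisions."""
    return b"forbidden" not in plaintext  # toy rule for illustration only

def send_message(plaintext: bytes, encrypt, transmit) -> bool:
    # The scan runs BEFORE encryption, so "end-to-end" encryption no longer
    # hides the content from whoever operates the scanning system.
    if not classify_content(plaintext):
        return False  # blocked, with no explanation and no human review
    transmit(encrypt(plaintext))
    return True

# Toy usage: a stand-in "cipher" and a print-based "network".
send_message(b"perfectly innocent note", encrypt=lambda m: m[::-1], transmit=print)
```

Whatever encryption happens afterward, the operator of the scanner has already seen everything.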
Even if such a scanning system were introduced, there is no evidence it would stop a single instance of exploitation or even a single piece of related content; if there were, we would already have heard about it. Similarly strict measures designed to combat money laundering, also created under the excuse of fighting crime, have an abysmal success rate. A 2020 study titled “Anti-money laundering: The world's least effective policy experiment? Together, we can fix it” found that anti-money laundering regulations stop less than 0.1% of criminal finances. Untold man-hours and resources go into excruciating KYC and regulatory compliance that has zero practical impact on criminals and ends up targeting only innocent people sending insignificant amounts to each other.
State-Level Agents
The only reason the EU is scrambling to introduce the Chat Control law is that it doesn’t have any powerful corporations to do “upload moderation” on its behalf. The diversity of cultures, languages, and legal frameworks in the EU makes the emergence of such a corporation unlikely in the foreseeable future. But the US does have such corporations, and they gladly abide by whatever requests the US government issues, serving as a vanguard that spreads and maintains Pax Americana in cyberspace. Corporations such as Meta (formerly Facebook) are given leeway as long as they do what they’re told, with the US government introducing new regulations to prevent competition from taking hold and upsetting them.
My research into what the values of cypherpunk are showed that cypherpunks saw various governments and state-level agents encroaching on our digital rights and freedoms, such as the freedom of online speech, through automated surveillance and behavior analysis. More importantly, they saw the advancement of consumer-grade encryption as the way to revolutionize digital transactions and communications and bring about a utopia without a violent confrontation with a government or a state-level agent. Even if you don’t care about encryption and have nothing to hide, corporate surveillance systems can use you as a proxy, gathering your metadata to spy on someone else, which is best seen with Meta’s “shadow profiles.”
Automated Metadata Gathering
Out of all state-level agents on the internet, Meta has done the most to intrude on everyone’s privacy, demonstrating how devastating a combination of incompetence and indifference can be. Time and time again, Meta has shown an astonishing ability to gather tidbits of anonymous information and correlate them to track people interacting online, creating vast databases that somehow always get breached, while Meta gets no more than a slap on the wrist for it. Those databases include seemingly irrelevant information gathered on people who don’t have an account with any of its services. For each of them, Meta has created a “shadow profile” that is updated through their interactions with those who do use Meta’s services, lying dormant until the person signs up for a Meta service.
All Meta apps, such as Messenger, ask for a wide array of permissions that effectively turn a mobile phone into a surveillance device. For example, the app might ask for access to WiFi information before it can be used. That doesn’t seem like it could do much harm, right? But if the app on Alice’s phone reports nearby WiFi names and signal strengths back to Meta, the platform can figure out when Alice is working, walking, jogging, having lunch, riding a bike, or sleeping. If Bob’s phone does the same, the platform can now figure out whether the two are co-workers, neighbors, friends, lovers, spouses, or family, and that’s just by looking at WiFi information. Add in other information, and the platform suddenly has the terrifying potential to read people’s minds when they least expect it.
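Here is a back-of-the-envelope sketch of how that correlation could work, assuming nothing more than timestamped lists of nearby network names; the scan data and the similarity threshold are invented for illustration.

```python
# Illustrative only: invented scan data and an invented similarity threshold.
alice_scans = {
    "09:00": {"OfficeNet", "CafeGuest", "CITY-FREE-WIFI"},
    "13:00": {"CafeGuest", "Bistro5G"},
    "22:00": {"HomeRouter_A"},
}
bob_scans = {
    "09:00": {"OfficeNet", "CITY-FREE-WIFI"},
    "13:00": {"CafeGuest", "Bistro5G"},
    "22:00": {"HomeRouter_B"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of observed network names (0 = none, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# At which times do Alice and Bob see mostly the same networks?
together = [t for t in alice_scans
            if jaccard(alice_scans[t], bob_scans.get(t, set())) > 0.5]
print(together)  # ['09:00', '13:00'] -> likely co-workers who also share lunch breaks
```

Scale this from two phones to billions, and from a toy threshold to models trained on years of data, and the inferences only get sharper.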
The Cambridge Analytica Scandal
The Cambridge Analytica scandal that broke in 2018 revealed how easy it is to build psychological profiles of people by looking at nothing more than their likes and their friends’ likes. Cambridge Analytica paid to access the platform and run a survey on its users, who were compensated for answering questions. So far, so good; Meta allows this, and users agree to it when signing up for the platform. But thanks to Meta’s lax security, with founder Mark Zuckerberg famously stating that “security is not a problem you ever fully solve,” Cambridge Analytica was able to access the profile information of the survey respondents’ friends, including what they liked and their location.
Through roughly 270,000 survey respondents, Cambridge Analytica gained access to the profile information of at least 30 million Meta users. It used the information on what each person liked to build a probability-based profile of them, with more information increasing the likelihood of a correct guess. For example, if Claire liked two particular movies, it could mean a 10% chance she has a dog; if she liked two other movies, the chance could rise to 60%, and the dog was probably a golden retriever. If Claire also liked a certain organization’s page, it could mean a small chance her parents were getting divorced; another tidbit of information could mean Claire’s dad would get to keep the dog. The more information Cambridge Analytica gathered, the more powerful its ability to cross-reference Meta users’ profiles and build a detailed psychological profile of each. It gets even worse, because someone who knows this kind of information about people can run daily experiments on them and try to influence them.
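The arithmetic behind that kind of guesswork can be as simple as the sketch below. The pages, likelihood ratios, and prior are all invented and have nothing to do with Cambridge Analytica’s actual models; the point is how each additional like sharpens the guess.

```python
# Invented likelihood ratios: how much more likely a dog owner is to like each page.
# Real models are fit to harvested data; these numbers are made up for illustration.
likelihood_ratio = {
    "Movie A": 1.3,
    "Movie B": 1.2,
    "Movie C": 3.0,
    "Movie D": 2.5,
    "Hiking Club": 1.8,
}

def p_dog_owner(likes, prior=0.3):
    """Update a prior guess with each observed like, naive-Bayes style."""
    odds = prior / (1 - prior)
    for page in likes:
        odds *= likelihood_ratio.get(page, 1.0)  # unknown pages carry no signal
    return odds / (1 + odds)

print(round(p_dog_owner(["Movie A", "Movie B"]), 2))                 # weak evidence
print(round(p_dog_owner(["Movie C", "Movie D", "Hiking Club"]), 2))  # much stronger
```

Swap “dog owner” for any trait you can think of, repeat across thousands of pages, and the profile writes itself.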
Psychological Manipulation and Advertising
We only know about the Cambridge Analytica scandal because of Donald Trump’s election. In the wake of his win, the mainstream media in the UK and the US started looking for an explanation, resulting in a series of high-profile articles that brought attention to the issue of metadata gathering and analysis. The entire reason why Cambridge Analytica ran the survey was to gather information that could be used in political advertising by people such as Donald Trump, who saw the power of tailoring political ads to specific demographics. For example, if people in one region prefer dogs over cats, a political ad targeting them should include a dog. If their preferred breed is a golden retriever, including one in the ad makes it even easier for the candidate to seem relatable and likable.
Aligning an ad with what people already like is one thing, but the Holy Grail of psychological profiling is to craft tailored messages that change the behavior of specifically targeted individuals. However, if those individuals are aware of it, the attempt might backfire, with the most notable example coming from Meta itself. In 2016–2017, Meta rolled out a series of changes to fight misinformation and the sharing of disputed news, which included putting a red exclamation mark next to controversial user content. As Meta later admitted on its blog, the exclamation mark had “the opposite effect to what we intended.” It turned out that people looking for controversial content were driven by curiosity, which appears to be the perfect antidote to manipulation.
The lesson here should be clear: we only learn about surveillance, metadata gathering, and psychological manipulation when they fail. When they work, they’re invisible, which is why they’re so insidious and why it’s safe to presume we’re being surveilled and manipulated at all times, in all places, and through proxies. Hence, we should do the best we can to shield ourselves from data gathering and psychological manipulation without losing our curiosity or relinquishing our access to cyberspace.
My Bizarre Episode With Meta
I stopped using Meta in 2016 after a bizarre episode indicated that someone on the backend was tracking my private chats with my Meta contacts. One morning, I found on my desktop a perfectly formatted 130-page DOCX chat log with one of my Meta contacts. At first, I thought I must have clicked somewhere to download it, but no matter how hard I tried, I couldn’t replicate a document with the same impeccable formatting, complete with our tiny profile pics, links to our profiles, and message timestamps. After watching the 2016 movie “Snowden,” which depicts intelligence agencies enjoying sweeping access to platforms such as Facebook, down to people’s private messages, I realized it was time to cut the platform out of my life.
Still, leaving Meta wasn’t enough to shake off that lingering feeling of being spied on. When I recently tried to make a new Meta account with a burner phone number and email address to send a message to a potential kindred spirit, the platform kept blocking me until I used my real name and date of birth. Meta already knew who I was thanks to my shadow profile, even though I had deleted my old profile and asked the platform to purge all the information it had on me. It’s been keeping tabs on me all this time, and I’m positive it does the same to all of us. The best way to describe how it felt is to say it was like being in a panopticon.
Proposed by Jeremy Bentham in 1791, the panopticon (“the all-seer”) was a prison laid out like a wheel, with a guard tower at the hub and inmate cells along the rim. The tower was designed so that a guard inside could see everything the inmates did, while the inmates could never see the guard. The idea was that inmates would never know when they were being watched and would behave out of fear of punishment. Inmates who spent some time in the panopticon would learn to police themselves, living in the prison of their own minds long after they’d been released from the physical one.
My Dedication to Building Strong Systems
It was only after I started writing for PlasBit, in an attempt to answer what the values of cypherpunk are, that I realized leaving Meta is not enough. Meta has become a global cyberspace panopticon that affects how we live, work, and interact with each other. Even if I avoid using Meta or the internet altogether, those near me who do use them become proxies through which I, too, am dragged into the digital prison. Every asset Meta controls, down to the tiniest “Like” and “Share” buttons on websites, is there to reinforce the panopticon, track us all, and remind us we’re never free.
What I now know is that I have to work on building strong systems that are impervious to surveillance and manipulation, systems built by the people and belonging to the people. Cypherpunks talked about such systems, and that’s how we got cryptocurrencies such as Bitcoin. Cryptocurrencies didn’t appear out of nowhere; they were painstakingly built by generations of people who saw how much we need privacy in transactions and how much not having it can harm us. People like you and me shared their ideas, dreams, and fears in private until they figured out a way to make their vision a public reality. I found the same open and welcoming environment in PlasBit, proving to me that it practices what it preaches.
PlasBit’s Alignment With Cypherpunk Values
From what I’ve seen writing for PlasBit and interacting with its employees over the past three months, it is a company driven by cypherpunk values. PlasBit aims to create an environment where privacy is respected as the fundamental human right, the one that enables us to think, speak, and interact without fear, and through which we can embark on a path to total autonomy. According to PlasBit’s manifesto, the company believes in achieving true financial freedom through a blockchain-based ecosystem that lets users escape the control of the conventional banking system. At first, I was skeptical, but now it seems possible.
Cypherpunk writings came from a genuine desire to improve the state of human rights, attracting more and more attention until they generated enough momentum to create a societal paradigm shift. Based on how PlasBit has treated me, the company is set to be a similar generator of global change. All my writing requests and suggestions were taken into consideration, and I was treated as a valuable team member whose input matters. We shared findings, ideas, concerns, and laughs, working on problems as they emerged and encouraging one another to stay true to ourselves.
Choosing a Different Approach
Each company has to decide on a set of core values to drive its business, but people can do it too. You can envision the ideal future for yourself, choose a set of values that will bring about that future, and search for companies with matching values. In my case, I was adamant that I would value my autonomy and find a company that did the same; in PlasBit, I found it.
In PlasBit, I discovered a company that doesn’t put profit above everything else and that cherishes integrity and honesty as the foundation of a constructive, long-term relationship. In PlasBit, I found a true cypherpunk partner. I hope you can do the same for yourself and do the best you can to make your vision of the future come true. That’s the only way you can guarantee you’ll always be in good company.