Why Privacy is About Power, Not Secrecy
- Mar 3
- 6 min read
Michael Kingsnorth is the founder of Scramble Technology Inc. (Canada) and AperiMail Ltd. (UK). He writes about privacy, technology, and the future of human agency in the digital world, drawing insight from both entrepreneurship and philosophy.
Every government that has surveilled its citizens believed it was justified. Every corporation that harvested behavioural data did so under a privacy policy someone technically accepted. Every centralised database that was later breached was once described as secure. Power rarely announces itself as abuse; it presents itself as necessity.

After more than a decade of building and securing digital systems, what concerns me most is not the technology itself. It is how the privacy debate has been deliberately reduced to something trivial. We argue about consent banners and regulatory checklists while ignoring the structural question underneath: who holds power once data is collected?
Privacy is not about secrecy. It is about leverage. Whoever controls data controls interpretation, and whoever controls interpretation controls outcomes. The real question is who accumulates behavioural, financial, biometric, and identity data, and who controls its interpretation when political, legal, or economic conditions shift. In a volatile world, that question is not philosophical; it is practical.
The concentration problem
For more than twenty years, digital infrastructure rewarded aggregation. The more data collected, the stronger the predictive model. The more unified the database, the more efficient the administration. Centralisation delivered speed and scale. It also concentrated power to a degree without historical precedent.
When identity records, health data, financial histories, biometrics, and location trails converge in central repositories, those systems stop being passive storage. They become instruments of leverage. They attract criminal enterprises, intelligence services, political actors, and corporate competitors, often at the same time.
By early 2025, cumulative GDPR fines had reached approximately €5.88 billion, and more than €3 billion in further fines followed in the first half of 2025 alone. Enforcement headlines dominate, but the deeper shift is economic. Organisations that minimise data collection reduce breach exposure, shrink regulatory surface, limit litigation risk, and often lower cyber insurance costs simultaneously. Data restraint is not anti-growth; it is risk management with competitive upside. The less you collect, the less you must defend.
Compliance culture rarely confronts the central asymmetry. Once data is collected, the individual loses leverage. Consent is granted at a single moment in time, under conditions that may not exist in five years. Compliance manages the consequences of asymmetry. It does not eliminate it.
The institutional stability bet
The familiar ‘nothing to hide’ argument assumes stability. It assumes definitions of acceptable behaviour remain constant. It assumes data will only ever be interpreted within its original purpose. History offers no evidence that institutions remain static.
Ask a simple question: would you be comfortable if every message, search query, and location record from the last decade were placed in the hands of a government that does not yet exist, operating under laws not yet written? The hesitation most people feel is not paranoia; it is pattern recognition.
Recent policy developments illustrate the speed of change. In early 2025, the removal of multiple members from the Privacy and Civil Liberties Oversight Board left the independent oversight body without a quorum, sidelining its ability to function.
In Europe, the proposed Chat Control regulation advanced through EU institutions, prompting Signal to state it would exit the European market rather than compromise encryption.
In the United Kingdom, amendments introduced in the House of Lords would have required mandatory ‘on-device’ surveillance software on smartphones. Although framed as targeting specific categories of illegal material, such a mandate requires devices to inspect user data to determine whether it matches those categories. You cannot target content without evaluating it first.
China formalised a National Online Identity Authentication Public Service, linking citizens to a government-verified digital identity token.
These developments occurred within a single year across the world’s largest digital economies. When institutions hold comprehensive behavioural and identity datasets, individuals become dependent on those institutions remaining permanently competent and benevolent. That is not a privacy framework. It is a stability bet.
The AI amplification effect
Artificial intelligence has altered the risk equation.
A decade ago, a stolen database was static. It could be searched and sold, but analysis required human labour and time. Friction provided limited protection. That friction has disappeared.
Modern AI systems correlate, infer, and synthesise meaning across billions of records in seconds. Data once considered low sensitivity can be combined with leaked biometrics, location trails, and social graphs to generate identity profiles of extraordinary precision. Past breaches become more dangerous over time because analytical capability improves.
In an AI-saturated environment, the only data that cannot be exploited is data that was never collected. Collection creates optionality for others.
The privacy time dimension
Time compounds risk.
In 2024, NIST standardised post-quantum cryptographic algorithms because current public key systems are expected to become vulnerable as quantum computing matures. The harvest-now, decrypt-later model is not theoretical. Adversaries already capture encrypted traffic with the expectation that future computational power will unlock it.
Since Shor's algorithm was published in 1994, long-lived secrets protected by quantum-vulnerable public-key cryptography have carried a deferred quantum liability. Data retention is not only a present exposure. It is a future exposure that grows with analytical capability and computational power.
Minimisation is therefore not ideology. It is engineering logic applied over time. Risk compounds. So does data.
Selective disclosure and proportional access
A mature digital system does not require excessive disclosure.
If a service needs to confirm a user is over eighteen, it does not require a full date of birth. If an employer needs to verify a qualification, it does not require unrestricted access to academic history. Modern identity standards, such as W3C decentralised identifiers (DIDs) and verifiable credentials, already allow attributes to be proven without revealing the underlying data.
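To make selective disclosure concrete, here is a minimal sketch in the spirit of SD-JWT-style credentials: the issuer signs salted hashes of each claim, and the holder later reveals only the claims a verifier needs. HMAC with a shared key stands in for a real asymmetric signature, and every name and key here is invented for illustration.

```python
import hashlib
import hmac
import json
import secrets

# Hypothetical issuer key. A real system would use an asymmetric
# signature (e.g. Ed25519), so the verifier never holds signing material.
ISSUER_KEY = b"demo-issuer-key"

def _digest(salt: str, name: str, value) -> str:
    """Salted hash of a single claim; the salt hides low-entropy values."""
    return hashlib.sha256(json.dumps([salt, name, value]).encode()).hexdigest()

def issue(claims: dict):
    """Issuer salts and hashes each claim, then signs only the digests."""
    disclosures = {n: (secrets.token_hex(8), v) for n, v in claims.items()}
    digests = sorted(_digest(s, n, v) for n, (s, v) in disclosures.items())
    sig = hmac.new(ISSUER_KEY, json.dumps(digests).encode(),
                   hashlib.sha256).hexdigest()
    return {"digests": digests, "sig": sig}, disclosures

def present(token: dict, disclosures: dict, reveal: list) -> dict:
    """Holder forwards the signed digests plus only the chosen disclosures."""
    return {"token": token, "revealed": {n: disclosures[n] for n in reveal}}

def verify(presentation: dict) -> bool:
    """Verifier checks the issuer signature, then that each revealed claim
    hashes to a signed digest. Unrevealed claims stay hidden."""
    token = presentation["token"]
    expected = hmac.new(ISSUER_KEY, json.dumps(token["digests"]).encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False
    return all(_digest(s, n, v) in token["digests"]
               for n, (s, v) in presentation["revealed"].items())
```

A holder issued both `over_18` and `name` can present `over_18` alone; the verifier confirms the issuer vouched for that attribute without ever seeing the name or a date of birth.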
In March 2025, Utah enacted a digital identity framework that explicitly prohibits surveillance, tracking, or monitoring tied to its use. It rejects identity models designed to phone home, where every credential presentation generates a trackable record. The question for any identity system is not simply whether it functions. It is whose interests it ultimately serves.
The builder’s responsibility
Architects of digital systems are making generational decisions.
Every database schema encodes a power relationship. Every retention policy is a bet on institutional continuity. Architectures that default to collecting everything and analysing later assume the surrounding environment will remain stable. That assumption deserves scrutiny.
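As a concrete illustration of how a schema encodes that bet, compare two hypothetical record definitions for the same signup flow; the field names are invented for this sketch, not drawn from any real product.

```python
from dataclasses import dataclass, field

@dataclass
class MaximalUser:
    """'Collect everything, analyse later': a bet that the environment
    around this data stays stable for as long as it is retained."""
    email: str
    full_name: str
    date_of_birth: str
    phone: str
    ip_history: list = field(default_factory=list)      # every login IP, kept indefinitely
    location_trail: list = field(default_factory=list)  # (lat, lon, timestamp) tuples

@dataclass
class MinimalUser:
    """Collect only what the service must defend."""
    email_hash: str  # salted hash: enough for login and deduplication
    over_18: bool    # derived attribute instead of a full birth date
```

A breach, subpoena, or policy shift exposes everything in the first schema; the second leaves almost nothing to take.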
The organisations most likely to endure will not be those that accumulate the most data. They will be those that can demonstrate they require the least. Necessity is defensible; excess is exposure.
Designing for an unstable future
We operate in an environment defined by acceleration and uncertainty. Decisions made under one regulatory, political, or technological context persist into the next.
Centralising vast quantities of personal data is not neutral. It is a wager on permanent stability. The emerging privacy economy takes a different position. Restraint reduces systemic risk.
Proportional access limits leverage concentration. Data never collected cannot be breached, subpoenaed, or weaponised.
Privacy is not about hiding from institutions. It is about designing systems that remain resilient when institutions evolve. Stability cannot be guaranteed; architecture can be chosen.
The privacy economy will not be built by regulation alone. It will be built by architects, founders, engineers, and citizens who decide that excess collection is a liability rather than an asset. The systems we design today will determine who holds leverage tomorrow. The question is simple: will we continue to optimise for extraction, or will we design for resilience?
If you are building, investing, or governing in the digital economy, this is not a theoretical debate. It is a design choice. Make it deliberately. In a volatile era, that is not idealism. It is disciplined engineering and disciplined economics.
Michael Kingsnorth, Technology & Privacy Contributor
Michael Kingsnorth is an entrepreneur, technologist, and writer exploring the boundaries between privacy, technology, and human freedom. As founder of Scramble Technology Inc. in Canada and AperiMail Ltd. in the UK, he develops privacy-first communication systems and decentralized identity frameworks. Through his blog, The Vortex of a Digital Kind, he examines how emerging technologies shape consciousness, ethics, and autonomy. His writing blends technical understanding with philosophical reflection, encouraging readers to question how we live, connect, and think in an algorithmic age.