Abstract
Surveillance technologies, once limited to cameras and databases, have evolved into powerful tools of control. Driven by artificial intelligence, they no longer merely influence behaviour but increasingly control it, raising profound risks for democracy. This article traces the shift from surveillance capitalism to today’s societies of control, drawing on thinkers such as Foucault and Deleuze, and highlighting examples from the United States, Hungary, Turkey, and beyond. While the dangers of authoritarian drift are clear, the article also considers Carlota Perez’s argument that the same technologies could underpin a new golden age if harnessed for green growth, shared prosperity, and global inclusion. The future, it argues, remains a matter of political choice rather than technological inevitability.
Introduction
In 2018, Americans were shocked to discover that Cambridge Analytica, a British consulting firm, had harvested personal data from 87 million Facebook users, processed it, and deployed it on behalf of Republican campaigns, which used it to target voters with personalised political messaging. This was not a one-off scandal. Instead of reform, the lesson politicians and corporations took was that surveillance works, and that it should be expanded.
Since then, these techniques have been continuously refined. With the arrival of artificial intelligence, they are moving beyond influencing voters and consumers toward something more profound: controlling them. Consider what AI-driven social media has already done to society. With generative AI, surveillance-based prediction is rapidly approaching the reliability of causal relationships rather than mere correlation. What began as cameras, data collection and algorithmic sorting is now a toolkit for authoritarian regimes, not just in China and Russia but potentially also in the USA.
The danger is compounded by today’s global context: polarisation and grievance are rising, surveys show that majorities distrust institutions, and some people even justify violence. There is clearly something fundamentally wrong in Western societies. Authoritarian leaders exploit this sense of crisis to justify stronger intervention and tighter control. Under such leadership, AI will only further undermine trust and democracy, deepening anxiety, fear and grievance, and fuelling yet more polarisation and social unrest.
Surveillance, discipline and the societies of control
These developments were already visible to philosophers in the previous century. Michel Foucault (1926–1984) explained through the metaphor of the panopticon how people regulate themselves when they know they may be watched. Gilles Deleuze (1925–1995) later described our present as a “society of control,” where monitoring is continuous and flexible, embedded in networks and codes. John Stuart Mill (1806–1873) had earlier warned of the “tyranny of the majority,” where collective opinion could suppress individuality and dissent as effectively as laws.
Today’s algorithmic surveillance and predictive analytics bring these warnings to life: they reward conformity and punish deviation, often invisibly. Every smartphone reports our locations dozens of times a day, fitness trackers share health data with third parties, and social media apps log every pause of our scrolling. These are no longer just metaphors — they are the architecture of our digital lives.
Continuity of surveillance — from PRISM to Cambridge Analytica and beyond
The PRISM revelations showed more than a decade ago that the USA was already conducting mass surveillance of global communications. Cambridge Analytica (UK) exposed how personal data could be weaponised for political manipulation. Russia and China now use similar tools to undermine democratic nations, control their own citizens, and suppress human rights. In the UK, such strategies played a role in the Brexit referendum.
Yet these scandals triggered almost no systemic reform. Data harvesting, behavioural profiling and cross-platform tracking remain intact — and are now being supercharged by AI. With AI, targeting becomes more precise, manipulation more subtle, and predictive power vastly more dangerous. Surveillance capitalism has not been dismantled; it has been placed on steroids. Worse, large AI systems lack integrity: they produce convincing outputs, but they shift positions easily, spreading disinformation while still appearing authoritative. In authoritarian hands, they are the perfect instruments of control.
Why technology makes control cheaper and easier
Technology lowers the barriers to visibility and accelerates reaction. Monitoring citizens once required large bureaucracies; now a handful of algorithms can analyse billions of data points in real time. Biometric identification, facial recognition tied to CCTV, predictive analytics that flag “risk” scores, and social-media manipulation all combine to make large-scale social management practicable.
These systems are already fine-tuned in China, where citizens are scored and tracked across every aspect of daily life. In Western democracies, similar infrastructures are in place — we simply tell ourselves they are for convenience. Most people accept this trade-off: location tracking helps us find restaurants, targeted ads feel relevant, security cameras make us feel safer. But the long-term cost is a society in which dissent can be mapped, targeted and suppressed with unprecedented efficiency. By 2025, with nearly 50 billion connected devices worldwide, each of us carries multiple surveillance nodes at all times.
Scholars Katina Michael and MG Michael call this condition “uberveillance” — surveillance taken to its logical extreme, where monitoring is ubiquitous, integrated across devices, networks and even bodies. The concept predates Shoshana Zuboff’s surveillance capitalism and captures not just the economic but the existential risks of being tracked and analysed at all times. With AI, uberveillance becomes not just a possibility but a governing strategy.
Concrete examples: algorithms and the politics of bias
Algorithmic risk scores in criminal justice and predictive policing have repeatedly shown racial and socio-economic bias. These are not theoretical flaws: studies show that such systems misclassify people and entrench existing injustice. In the USA, ICE already deploys databases and analytics to identify people for deportation. With AI, such tools become sharper, faster, and harder to contest — with devastating effects on real lives.
Why the United States matters now
The United States has long been seen as a bulwark of liberal democracy. But over the last decade democratic principles there have been eroding, and the country is sliding toward illiberal democracy. A government willing to weaponise AI, predictive algorithms and pervasive monitoring for partisan purposes could turn efficiency tools into instruments of control. The danger is not sudden totalitarianism, but incremental authoritarian practice: selective enforcement, targeted suppression and bureaucratic exclusion — already visible under the current administration.
The warning signs are clear for everybody to see: racial profiling for deportation, political violence, the undermining of free speech, the use of the military against its own citizens, the exaggeration of crime rates, and the deliberate weakening of institutions. These tactics create fear, making surveillance — and thus control — appear acceptable; many business leaders, bureaucrats and institutions are falling into line. Meanwhile, authoritarian practices have also been embedded in supply chains, spreading surveillance capacities through the hardware we use every day.
Modern authoritarianism often preserves the outward forms of democracy — elections, legislatures, courts — while hollowing them out. Political scientists describe this as “competitive authoritarianism”; George Packer has called it “zombie democracy,” where institutions stagger on without real independence, giving the impression of vitality while serving the interests of power. This disorienting normality is precisely what makes the erosion of liberty so difficult to resist.
History suggests that populations often embrace control when life feels unstable. In Weimar Germany, widespread hardship made people willing to trade liberty for bread and order. Today, although living standards are higher, rapid technological and social change has created a sense of immiseration — alienation, stress and distrust. Alvin Toffler called it “too much change too quickly,” and historian Peter Turchin warns of “popular immiseration” as a precursor to civilisational crisis. In such conditions, promises of order and security become dangerously attractive, even when they erode democracy itself.
Historical analogies — a caution, not a literal equivalence
History shows what happens when technology meets ideology. The Nazis used mass propaganda to mobilise obedience, keeping people naïve and complicit in atrocities. The Stasi built a vast network of informants who spied on neighbours, friends and even family.
My father was arrested by the Nazis during WWII simply for speaking out. Imagine a regime with similar intentions now armed with AI-enhanced visibility. Unlike propaganda or informants, these systems operate cheaply, continuously and invisibly. It is not only surveillance — it is surveillance multiplied by machine intelligence, capable of scaling coercion in ways past regimes could only imagine.
Yet it is important to stress that 21st-century authoritarianism is not a replay of classical fascism. There are no jackboots or mass rallies announcing its arrival. As Packer has argued, it is unlike fascism in form. Instead, it preserves the surface trappings of democracy — elections, courts, media — while hollowing them out from within. This quiet erosion, cloaked in normality, is precisely what makes the danger so insidious.
We already see these developments in countries such as Hungary, Turkey, India and Venezuela, where opposition parties, courts and the press remain formally in place but are steadily weakened until they function as little more than stage sets. This pattern shows that democracy today often dies not with a coup, but through slow suffocation.
What to do now — guardrails for a digital polity
These technological developments are unstoppable, and there is little appetite in democracies under stress to resist them. We know what needs to be done, and I have written about it on many occasions, but the political will is lacking. The counterweights are clear: strong privacy laws, enforceable limits on surveillance, independent oversight bodies, and investment in public-interest technology. These are not technical tweaks but political choices. Yet governments have preferred convenience — for themselves and for citizens — over hard reforms.
And here lies the paradox. Convenience is seductive: tapping to pay, unlocking phones with a glance, targeted updates on what matters “to you.” But if the price of convenience is the infrastructure of control, then the responsibility cannot rest only on individuals “opting out.” It requires collective action — laws, institutions and civic engagement — to prevent governments and corporations from exploiting convenience as a pathway to authoritarianism.
Outlook and opportunities
While the trajectory toward surveillance and control is deeply troubling, history shows that new technological systems also bring opportunities. As Carlota Perez has argued, each technological revolution can deliver a positive “golden age” if institutions and societies channel innovation toward broad social goals. In the current transition, this could mean using AI and digital platforms not to manipulate, but to support greener growth, shared prosperity, and more democratic participation.
Perez singled out Europe as the region most likely to lead in shaping this new golden age. With its welfare traditions, environmental ambitions, and regulatory capacity, Europe, together with countries such as Canada, Australia and New Zealand, could harness digital and green technologies for broad social benefit. Yet she warned that this will be a long process, requiring sustained political will and investment over decades rather than years.
Crucially, Perez also stressed that a genuine golden age must be global. Any new wave of globalisation must include the Global South. Without extending the benefits of digital and green innovation beyond advanced economies, inequality and instability will deepen, undermining progress everywhere. For her, a sustainable transition depends on ensuring that the South is not left on the margins but integrated into the heart of technological and economic development.
Conclusion — choice, not inevitability
People often trade freedom for convenience or security. But convenience must not become the path through which liberties erode invisibly. Authoritarianism feels surprisingly normal — until it doesn’t. The move from surveillance to control is not inevitable; it is political. With AI accelerating the risks, we must set guardrails now. Otherwise, future crises will not be managed through open democracy, but through systems designed for manipulation, exclusion and control.
And yet, there is still an alternative path. Perez reminds us that technology can be steered, and that the same tools used for control can be redirected for collective good. If societies can find the courage and foresight to act, there remains the possibility of turning a dangerous trend into the foundation of a more inclusive and sustainable future.
Paul Budde