Data‑driven advertising isn’t just a new business model; it’s a cultural and political revolution. In her book The Age of Surveillance Capitalism (2019), Shoshana Zuboff argues that tech giants have created a new economic logic built on the extraction and monetisation of human experience. Far from simply providing search engines, social networks or video platforms, these companies systematically gather behavioural data, predict our actions and sell those predictions to the highest bidder. The result is a transformation not only of the market but of subjectivity itself – an evolution of what Theodor Adorno and Max Horkheimer once criticised as the culture industry: a system that turns art and thought into commodities, shaping consciousness for profit.
What is surveillance capitalism?
Unlike industrial capitalism, which commodified labour, surveillance capitalism commodifies behaviour. Zuboff traces how Google discovered that the “data exhaust” users leave behind when they type queries or navigate maps could be analysed to predict future behaviour. Over time, the company moved from simply improving services to predicting and influencing users, selling targeted advertising based on what it knows we will do next. Facebook, Amazon and hundreds of data brokers followed suit. This process, she argues, is not consensual; users’ data are taken without meaningful permission and used to train algorithms that shape what news we see, who we date and what we buy. The system thus operates through asymmetrical power: companies know everything about us, while we know little about how their predictions are made.
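To make this mechanism concrete, consider a deliberately simplified sketch in Python. Everything here is invented for illustration – the feature names, the synthetic data and the model – and it stands in for the general pattern Zuboff describes, not any company’s actual pipeline: behavioural traces go in, and a saleable prediction of future behaviour comes out.

```python
# A toy illustration of behavioural prediction. The features, data and
# model are invented for the example; real pipelines are vastly larger.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical "data exhaust": per-user counts of searches, map lookups
# and late-night sessions, collected as a side effect of using a service.
n_users = 1000
exhaust = rng.poisson(lam=[3.0, 1.5, 0.8], size=(n_users, 3))

# The outcome the platform wants to predict: did the user click a
# targeted ad? (Synthetic labels, correlated with the features.)
clicked = (exhaust @ np.array([0.4, 0.2, 0.9]) + rng.normal(size=n_users)) > 2.5

model = LogisticRegression().fit(exhaust, clicked)

# The "prediction product": a probability of future behaviour that can
# be sold to advertisers bidding for the most persuadable users.
new_user = np.array([[5, 2, 3]])
print(model.predict_proba(new_user)[0, 1])
```

The point of the sketch is the asymmetry: the user sees only a search box or a map, while the model’s inputs, weights and outputs remain invisible to them.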
Algorithmic culture and the data economy
The media scholar Ted Striphas coined the term algorithmic culture to describe how cultural forms are now curated by code. Streaming platforms decide which songs we hear and which films are recommended to us; social media feeds rank and filter posts. These processes are not neutral: they are designed to maximise engagement and, consequently, advertising revenue. The algorithms learn our preferences not just to satisfy them but to nudge them in profitable directions.
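The curation logic can be caricatured in a few lines. In the sketch below, every field and weight is hypothetical; it illustrates only the structural point that an engagement-optimised feed ranks content by the expected revenue from attention, not by the user’s stated preferences.

```python
# Schematic feed ranking with invented fields and weights: it shows the
# general shape of engagement-optimised curation, not any real platform.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_clicks: float   # model estimate of engagement
    predicted_outrage: float  # emotionally charged content holds attention
    ad_value: float           # expected advertising revenue if shown

def engagement_score(post: Post) -> float:
    # The objective rewards attention and monetisation directly; nothing
    # in it refers to what the user would choose on reflection.
    return post.predicted_clicks + 0.7 * post.predicted_outrage + 0.5 * post.ad_value

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("friend", predicted_clicks=0.2, predicted_outrage=0.1, ad_value=0.0),
    Post("news",   predicted_clicks=0.5, predicted_outrage=0.9, ad_value=0.3),
    Post("brand",  predicted_clicks=0.4, predicted_outrage=0.0, ad_value=0.8),
])
print([p.author for p in feed])  # charged and monetisable posts rise to the top
```

The “nudge” the paragraph describes is simply this ordering: whatever maximises the score is what the user sees first.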
This dynamic mirrors Michel Foucault’s idea of power/knowledge: power is productive, creating new desires and identities rather than simply repressing them. By analysing and organising our activities, algorithms generate knowledge about us that can then be used to influence us. In this sense, we participate in our own governance; we train the systems that will shape our future choices. Zuboff calls the resulting behavioural predictions surveillance assets – raw materials that can be packaged and sold in markets we rarely see.
From the culture industry to the digital panopticon
Adorno and Horkheimer warned that mass‑produced culture turns individuals into passive consumers. Zuboff extends this critique: we are not just consuming standardised products; we are also producing the data that enable corporations to anticipate and mould our behaviour. Foucault’s reading of Jeremy Bentham’s panopticon – a prison design in which inmates never know when they are being watched – is a useful metaphor here. In the digital realm, surveillance is continuous and often invisible. As Foucault observed, such power produces subjects who police themselves. Today’s apps encourage us to share our locations, monitor our workouts or track our sleep, embedding surveillance into everyday life.
Why it matters
Surveillance capitalism raises urgent questions about democracy, autonomy and resistance. When a handful of companies control the infrastructure through which we communicate, shop and learn, they exercise unprecedented influence over public discourse. The data economy can also exacerbate social inequalities, as when predictive policing algorithms concentrate attention on marginalised communities or lending algorithms deny credit on the basis of opaque criteria. As Zuboff notes, the danger is not just the loss of privacy but a future in which our behaviour is subtly engineered by entities we cannot hold accountable.
Understanding these dynamics is crucial for cultural studies. Theories of power, ideology and subject formation provide tools to analyse how algorithms are reshaping society. By recognising the continuities between the culture industry of the mid‑20th century and today’s digital economy, we can better grasp the stakes of our seemingly trivial acts of liking, scrolling and sharing. Informed critique is the first step toward developing forms of collective resistance and alternative infrastructures that prioritise human autonomy over profit.