On Polarization in the empire
Social media, search engines, and AI not only shape what we see – they shape who we become. How algorithmic logic perfects the bourgeois subject and reinforces cultural hegemony.

This is an exclusive guest contribution by sociologist and philosopher Julian R. Vale.
Introduction
I argue that (1) in some neoliberal[1] societies, algorithmic processes can and do function as tools in eroding civic virtue by prioritizing individual preference over the collective good—specifically deliberation and respectful communication. This erosion, however, is neither inevitable nor unpredictable nor beyond our ability to influence. For detailed empirical support, consider F.P. Santos, Y. Lelkes, and S.A. Levin's study "Link recommendation algorithms and dynamics of polarization in online social networks." This research provides direct evidence that algorithmic design can influence social and personal dynamics in ways that reinforce division and reduce opportunities for deliberation and respectful communication. Consider their conclusion:
“Finally, we note that recent work points out that an important distinction among social media platforms is providing the opportunity for users to tweak their newsfeed algorithm: Reddit offers that option and may be one reason to explain why this platform reveals less segregation than Facebook. Providing the opportunity for users to tune their link recommendation algorithm can inspire new intervention mechanisms.”
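The mechanism at issue here can be illustrated with a deliberately simple toy model (the parameters, the 0.3 "influence" step, and both recommender rules below are my own illustrative assumptions, not the authors' actual model): users hold opinions on a line, a recommender proposes new links, and opinions drift toward those of one's neighbors. Comparing a similarity-based recommender with a random one shows how the link-recommendation rule alone can produce more segregated "echo" neighborhoods.

```python
import random

def simulate(link_recommender, n_users=100, steps=400, seed=7):
    """Toy model: users hold opinions in [-1, 1]; at each step a user
    gains a link chosen by the recommender, then nudges their opinion
    toward the average opinion of their neighbors."""
    rng = random.Random(seed)
    opinions = [rng.uniform(-1, 1) for _ in range(n_users)]
    neighbors = {i: set() for i in range(n_users)}
    for _ in range(steps):
        u = rng.randrange(n_users)
        v = link_recommender(u, opinions, neighbors, rng)
        if v is not None:
            neighbors[u].add(v)
            neighbors[v].add(u)
        if neighbors[u]:
            avg = sum(opinions[w] for w in neighbors[u]) / len(neighbors[u])
            opinions[u] += 0.3 * (avg - opinions[u])  # social influence step
    return opinions, neighbors

def similarity_based(u, opinions, neighbors, rng):
    # Recommend the non-neighbor whose opinion is closest to u's ("homophily")
    candidates = [v for v in range(len(opinions)) if v != u and v not in neighbors[u]]
    if not candidates:
        return None
    return min(candidates, key=lambda v: abs(opinions[v] - opinions[u]))

def random_based(u, opinions, neighbors, rng):
    # Baseline: recommend any non-neighbor uniformly at random
    candidates = [v for v in range(len(opinions)) if v != u and v not in neighbors[u]]
    return rng.choice(candidates) if candidates else None

def neighbor_gap(opinions, neighbors):
    # Mean opinion distance across links: lower = more segregated clusters
    gaps = [abs(opinions[u] - opinions[v]) for u in neighbors for v in neighbors[u]]
    return sum(gaps) / len(gaps) if gaps else 0.0
```

Running both recommenders and comparing the average opinion gap across links gives a crude measure of segregation: in this toy setting, the similarity-based rule produces far more homogeneous neighborhoods than the random baseline, which is the kind of dynamic the passage above suggests users could counteract if allowed to tune the recommender.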
Taking this empirical conclusion as sufficient to frame the problem and consider solutions, I draw from Colin Koopman's genealogy of informational subjectivity and Marxist theory to sketch a picture of the contemporary experience of selfhood. I then (2) propose general ways to politically and ethically understand and respond to these forces and the agents behind them. Some of these ideas are neither new nor exclusive to a particular theory. This piece will be successful in its goal if I persuade the reader of claim (1). It will also be successful if I persuade the reader of claim (2): that constructive change is possible, for each of us, wherever we stand.
What do I mean by each of these heavy terms in claim (1)? By "algorithms" I mean computational processes employed by digital platforms (e.g., social media, search engines, ChatGPT, etc.) to maximize user interaction and time spent on the platform. These algorithms personalize content, recommendations, and social connections to keep users continuously engaged. Liberal political theory often aims for a neutral state that enables individuals to pursue their own conceptions of the good life without external imposition. Neoliberalism, a particular mutation of the liberal framework, allegedly extends economic rationality to all aspects of society, fostering a competitive, self-interested subject, most especially in the imperial core. By "civic virtue" I mean the capacity for good-faith deliberation with political opponents, the willingness to consume information that challenges one's own priors, and the prioritization of communal responsibility over individual convenience. Algorithmic design, within the liberal and neoliberal framework, prioritizes the satisfaction of individual desires, choices, and self-interest, potentially at the expense of shared societal standards, responsibilities, or communal interests. My claim is not that the self is deterministically shaped; quite the opposite. What I am saying is that we should be aware of how our moral, cultural, and political intuitions are shaped by, and in turn shape, the environment we inhabit, most especially through exposure to information.
Why should we accept this picture? And what should be done? I hope to offer some answers to these questions.
The Crisis
Koopman’s genealogy (how a subject comes to be)[2] of the “informational person” reveals how digital technologies contribute to the erosion of civic virtue by shaping individuals into self-interested, market-oriented subjects. We should accept Koopman’s claim for the following reasons. By intensifying individualistic desires, normalizing economic rationality across social domains, and subtly “depoliticizing” collective concerns, social media algorithms reinforce existing societal divisions under the guise of individual empowerment and free choice. According to Koopman, they thereby mold a specific kind of self—“the informational person,” or entrepreneur of the self—that is primarily oriented toward individual satisfaction and market competition rather than robust collective well-being or civic engagement. In some sense, the person constituted by information has existed since the invention of language and communication, but our digital environment has intensified the role of data in shaping our current landscape.
Algorithms connect individuals to predefined categories and formats, accelerating recognition and interaction—and in the process, shaping how people choose to see themselves and others. This leads to a “datafied subjectivity” or “database self” where individuals are “constituted, and not merely mediated, by our data” (Koopman 7). This process is deeply intertwined with the neoliberal ideal of individuals as “entrepreneurs of themselves,” viewing themselves as “capitalist micro-firms who must constantly remodel themselves in order to compete on a labor market” (Callinicos et al. 379). This understanding of the self is consistent with a model of economic rationality where individuals are “fundamentally self-interested and rational beings who will navigate the social realm by constantly making rational choices based on economic knowledge and the strict calculation of the necessary costs and desired benefits” (Oksala 128). This focus on individual utility and “maximum welfare” is embedded in economic theories of rationality (Koopman 13).
Neoliberalism, as a form of “governmentality”[3], together with those who consciously act within it, shifts societal understanding from one of class antagonism to an economic game for self-interested individuals. Neoliberalism and neoliberal actors recast social justice issues as economic calculations, stripping them of moral and political urgency. As Johanna Oksala argues, social problems, including violence, become depoliticized, framed as economic rather than political or moral issues. This “economic ontology” fosters a society where “everybody is a capitalist, an entrepreneur of himself,” effectively blurring class distinctions and promoting the idea that “ultimately everybody wants the same thing: to succeed in their enterprise and to win in the economic game” (Oksala 131). Cultivated self-interest reduces the scope for collective moral and political action aimed at broader societal justice. The state’s role, in this view, is to ensure conditions for competition and entrepreneurial conduct, often through “effective policing” rather than social provision, linking state violence inherently to neoliberal governance (Oksala 141).
The prioritization of individual preference inherent in user-engagement algorithms within the liberal framework actively undermines collective cohesion. The “sharing of common interests in goods and forms of entertainment creates a false sense of unity and equality,” which can lead to “an erasure of the space for critical consciousness” (Marcuse 8). This can manifest as identification with powerful figures even when it’s against one’s own class interest. Under a certain conception of liberalism, individuals and groups ideally “flourish” by capitalizing on individual differences and personal accountability, but certain emphases on individual differences can exacerbate group antagonism rather than foster collective action across differences.
Capitalist ideology manipulates identity by promoting multicultural forms of equality and consumer empowerment, while simultaneously deploying these for capital accumulation and the maintenance of class rule through the manipulation of difference. This process, termed “repressive desublimation” by Marcuse, “extend[s] liberty while intensifying domination,” effectively neutralizing potential challenges to the system by absorbing them into individualistic, consumerist forms.
To understand how algorithms shape what Colin Koopman calls datafied subjectivity, it is useful to examine concrete examples from widely used digital platforms. These platforms do more than mediate our experience of the world—they actively participate in co-constructing who we are, by formatting our choices, values, and expressions according to logics of engagement, optimization, and predictability. They also demonstrate the ways in which we participate in this co-creation.
Consider TikTok, where the For You Page (FYP) curates a stream of videos based on subtle behavioral signals—how long a user watches a clip, what they skip, which videos they rewatch. Rather than merely presenting content, the FYP can shape users’ sense of humor, political views, and even social values, often without their conscious awareness. Communities on TikTok—such as “BookTok” or “FinanceTok”—form not through deliberate association but through algorithmic sorting. Moreover, users frequently adjust their own behavior to appeal to the algorithm, adopting trends, hashtags, and editing styles favored by the platform. Identity can become reactive and performative, optimized for algorithmic amplification.
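The engagement feedback loop just described can be sketched in a few lines. This is a schematic illustration, not TikTok's actual recommender (which is proprietary and vastly more complex); the topic names, weights, and update rule are invented for the example. The point is structural: implicit watch-time signals update an affinity profile, and the profile then narrows what gets surfaced next.

```python
from collections import Counter

def recommend(topic_weights, catalog, k=3):
    """Rank candidate videos by the user's inferred topic affinity."""
    ranked = sorted(catalog, key=lambda video: topic_weights[video["topic"]], reverse=True)
    return ranked[:k]

def update_weights(topic_weights, video, watch_fraction):
    # Watch time is the implicit signal: finishing a clip boosts its topic.
    topic_weights[video["topic"]] += watch_fraction

topics = ["books", "finance", "cooking", "politics"]
catalog = [{"id": i, "topic": t} for i, t in enumerate(topics * 3)]
weights = Counter({t: 1.0 for t in topics})  # start with no preference

# One slightly longer watch of a single "books" clip...
update_weights(weights, {"id": 0, "topic": "books"}, 0.9)
# ...and the next feed is already dominated by that topic.
feed = recommend(weights, catalog)
```

After a single strong engagement signal, every slot in the top-3 feed belongs to the same topic; communities like "BookTok" emerge from exactly this kind of sorting rather than from deliberate association.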
Google Search provides another illustration. By personalizing search results based on a user’s history, location, and demographic profile, Google effectively filters the world’s information into individually curated epistemologies. Even the act of questioning is shaped by Google’s autocomplete suggestions and “People Also Ask” panels, which nudge users toward certain framings of a topic. For example, typing “Is climate change real” versus “Why is climate change a hoax” leads to two vastly different informational landscapes. The questions we ask and the knowledge we acquire are thus entangled with algorithmic logic, contributing to an epistemic environment that can reinforce polarization and undermine shared understanding.
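The filter-bubble dynamic above can likewise be made concrete with a small sketch (the URLs, tags, and overlap-scoring rule are invented for illustration; Google's actual ranking is proprietary): the same result set, re-ranked against two different interest profiles, surfaces opposite top results.

```python
def personalized_rank(results, profile):
    """Re-rank a shared result set by tag overlap with a user's interest profile."""
    def score(result):
        # More shared tags with the profile -> higher rank
        return len(set(result["tags"]) & profile)
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "ipcc-report", "tags": {"science", "consensus"}},
    {"url": "skeptic-blog", "tags": {"hoax", "opinion"}},
]

# Two users, same query, different histories encoded as interest profiles:
top_a = personalized_rank(results, {"science"})[0]["url"]
top_b = personalized_rank(results, {"hoax"})[0]["url"]
```

Each user sees a first page that confirms the framing they arrived with, which is the "individually curated epistemologies" problem in miniature.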
The most concerning development is, of course, the proliferation of Large Language Models (LLMs) such as ChatGPT, Perplexity, Claude, or Gemini.
LLMs can assist in information collection, but they still require the user to validate and sort through this information. Outsourcing one’s critical-thinking skills or tasks to AI is an increasingly common development in high school, college, industry, law, and beyond. The danger lies in who gets to control what information the AI has access to, and how that information is validated. Is public reason enough? How do we decide which information is true or not true relative to a particular discipline or system?
In all these examples, we see that algorithmic systems do more than respond to human behavior—they help shape it. They foster forms of subjectivity that are increasingly individualized, optimized, and legible to systems of control and profit. The result is not merely a more efficient user experience, but a deep transformation of the self into something data-driven, economized, and often depoliticized. Understanding this dynamic (which happens beyond just the algorithmic space) is central to resisting the erosion of civic virtue and imagining new modes of solidarity in a data-saturated world.
To again be clear, this is not to say we are all passive agents controlled by the media or the information we choose to consume. It is to take seriously the picture of subjectification in our contemporary society. If we take Koopman seriously, that also means we each possess power to resist polarization at the individual level and potentially the collective level. This is most obvious with algorithmic processes, where one can select what one consumes, though one’s control over this information isn’t always obvious, particularly during adolescent development. A certain type of libertarian might insist that the picture I’ve painted is too dark. Humans are agents, they might say; a libertarian agent has the capacity to resist their environment. I don’t disagree entirely, but the information we consume certainly influences which decisions we see as available on the horizon. For Marx, who we are at a particular moment is largely a matter of our place in history, specifically our class position. The history of society is fundamentally a history of class struggles. To pretend our position in society is not influenced by the material and economic conditions that, in part, give rise to subject formation is to deny a basic fact of human existence: one’s class position. Capitalism is increasingly in the business of capturing existing desires and generating new ones. It has shifted society toward a desire-culture.
What Can be Done?
Having outlined the mechanisms of algorithmic subject co-formation, I now turn to the question: what can be done to resist and reshape this digital environment? For the U.S., social media is clearly one of the strongest “exports” the nation can still claim. If the wielders of U.S. foreign policy are interested in maintaining global hegemony, social media companies hold much power over shaping the content people have access to, globally and domestically. The same, of course, goes for Google and other search engines.
Likewise, those in a position to influence the development and regulation of social media have power over how strong its influence becomes, especially its influence over who a person becomes. Those of us who recognize the danger of social media and the internet in fulfilling certain psychological and social needs, needs apparently basic to our condition in contemporary life, also have the power to speak up and advocate for change: change for ourselves and for our societies. If the information we consume influences, to some degree, what our views and opinions are, we should mobilize our efforts to manage more carefully the way information is presented to us. Or, more simply, to foster a healthy bit of skepticism toward what we read and consume.
For Marxists, Critical Theorists, and members of liberal or illiberal democracies around the globe, cultivating algorithmic literacy by being aware of these micro and macro processes can help those on the ground and those influencing policy know where the danger lies. If algorithms participate in co-constituting who we are, or at least the information we have access to and consume, then we must consider reconstituting ourselves and our societies: reconstituting them to be the type of people, or type of society, we want to be or have. The task is not simply to “reclaim” a past civic virtue, but to invent new forms that are adequate to the historical and technological conditions of our time. This is a philosophical project, a pedagogical project, an existential project, and above all a political one—that is, a project that asks how we should live together as persons among people.
It means asking whether our agreed-upon ways of life provide a basis for restricting or controlling the influence of algorithmic processes, particularly on young people. It means asking whether we want to reverse the polarization we are experiencing in our societies. It means asking, if subjectivity is indeed shaped in part through information and algorithms, what kind of collective institutions—and what kind of individual imaginations—are required to shape a more just, solidaristic person. And just as important: a more just and solidaristic society. Or does it mean using the power of algorithmic processes to shape society in other ways?
Columbia professor Tim Wu tells of how figures like Brandeis and Theodore Roosevelt first confronted the democratic threats posed by the great trusts of the Gilded Age – but the lessons of the Progressive Era were forgotten in the last 40 years. He calls for recovering the lost tenets of the trustbusting age as part of a broader revival of American progressive ideas as we confront the fallout of persistent and extreme economic inequality. Perhaps that is one solution for liberal democracy. For the Marxist, dismantling the capitalist structures that commodify human attention, data, and social relations, while building collectively controlled alternatives, is another.
While I’m not interested in telling the reader how to structure their life, their society, or their theories, I am interested in encouraging the reader to engage in good-faith dialogue with their interlocutors. A re-design of digital platforms to prioritize “good-faith deliberation” and “communal responsibility over individual convenience” is one potential solution for the liberal or neoliberal. Using the platform for destabilization is another option for those interested in social or political change. Whatever the reader’s specific social or political preferences may be, certain algorithmic processes can be used for viewpoint diversity or for prioritizing a certain viewpoint. That we can be more aware of, and intentional about, these issues is my hope.
For those of us interested in real material change, emphasizing and resisting the material conditions that constructed the environments shaping men and women into the types of beings they are should be our foremost priority. How exactly this can or should be done remains unexplored in this essay, as do rebuttals from the libertarian, the capitalist, and, of course, other Marxists. I invite those who are troubled by the picture I’ve painted to use their power to resist domination by these forces, or the actors contributing to them, and to bring the tools of their various disciplines against it.
More importantly, we can unify and work together towards our shared interests against our various oppressors—the capitalist class being the greatest threat.
Works Cited
- Callinicos, Alex, Stathis Kouvelakis, and Lucia Pradella, editors. Routledge Handbook of Marxism and Post-Marxism. Routledge, 2021.
- Koopman, Colin. How We Became Our Data: A Genealogy of the Informational Person. University of Chicago Press, 2019.
- Marcuse, Herbert. One-Dimensional Man: A Study of Advanced Industrial Society. Beacon Press, 1964.
- Oksala, Johanna. Foucault, Politics, and Violence. Northwestern University Press, 2012.
- Wu, Tim. The Curse of Bigness: Antitrust in the New Gilded Age. Columbia Global Reports, 2018.
[1] What exactly neoliberalism is remains controversial. For the purposes of this article, neoliberalism and neoliberal actors operate through the universal application of economic rationality, underpinned by the theory of the self-interested rational actor, ultimately leading to the ideal of deregulated government and deregulated market sectors.
[2] Genealogy is a historical method developed by Foucault that aims to understand how our present concepts, practices, and forms of subjectivity came to be. It is often described as a “history of the present”, designed to make current conditions intelligible and open to critical examination and potential transformation.
[3] Neoliberal governmentality is a rationally coordinated set of governmental practices, emerging primarily from the American Chicago School economists after World War II, which extends market rationality and principles as a framework to all spheres of life in order to actively shape and manage populations by cultivating self-interested, entrepreneurial individuals as capitalists of themselves, thereby depoliticizing many social issues.