Works in Progress

Sareeta Amrute

What the Facebook Files Tell Us About Racial Capitalism

Note: This is a draft of an essay forthcoming in ACM: Interactions. Please do not cite without contacting me first!

On November 12, 2021, the public pension fund for the state employees of Ohio filed a class action lawsuit with the SEC alleging that it had purchased Class A Facebook stock at artificially inflated prices. These inflated prices were the result of Facebook's intentional cover-up of the harms its products caused around the world, the revelation of which by Frances Haugen in the Wall Street Journal caused substantial losses to the fund and its beneficiaries. As the court filings state, this case "arises from an egregious breach of public trust by Facebook, which knowingly exploited its most vulnerable users—including children throughout the world—in order to drive corporate profits" (2).

Press coverage of the harms revealed by Haugen's whistleblowing largely focused on the company's efforts to make its products enticing to young people (as young as 9 years old) even while those products, especially Instagram, were deemed to be dangerous, above all to teenage girls' wellbeing. In a somewhat lesser-reported but still noteworthy revelation, Haugen's testimony described Facebook's negligence in how its products were used to foment large-scale, genocidal violence against ethnic and religious minoritized populations outside the United States. As New York Times correspondent Kevin Roose noted in an interview with National Public Radio's Brooke Gladstone, while Facebook has "billions of users," it does not value them all equally. In the "rest of the world," it has "too many users" and "barely anyone who can manage the platform." In the United States, however, it is chasing younger audiences in a bid to stay relevant to advertisers, since it is the young users "that advertisers are trying to reach." While these may seem like separate, even diametrically opposed problems, they are nevertheless interlinked.

Chasing audiences of young users in the United States (and, we must admit, middle- and upper-class young users) has everything to do with their buying power: the consumer dollars they and their parents command are irresistible to advertisers, who make up the backbone of Facebook's revenue streams.

In the majority world, however, the average individual commands significantly fewer resources, and therefore counts for proportionately less in social media business models. It is no surprise, then, that while putting significant product-development resources into the lucrative youth market, such as through the ill-fated Instagram Kids, Facebook has shown little time, interest, or inclination to provide basic safety features for people living anywhere else. These features would include area experts; language experts; representation of people from caste-oppressed, Muslim, and Christian communities both in positions of authority and in the everyday work of content moderation; and, at its most basic, more funds and support for teams working on trust and safety outside the US, even when those teams propose solutions that contravene the capital-accruing tendencies of the company itself.

These revelations exhibit sharp inequities: the overheated pursuit of young eyeballs regardless of the deleterious effects on youth well-being, on the one hand, and callous disregard for how the platform is used to propagate violence and hatred against other populations, on the other. Together they suggest an uncomfortable fact: race, place, and position matter deeply to these tech companies, and not in the ways their DEIA handbooks might suggest. As such, the Facebook Files exhibit a classic case of racial capitalism.

In Cedric Robinson's classic definition, which has recently and fairly been described as inchoate but which nevertheless remains influential, racial capitalism refers simply to the fact that capitalism could not have developed without turning some people into biologically predestined 'haves' and others into biologically predestined 'have-nots'. The concept of racial distinctions, which predated the development of industrial capitalism, provided a convenient alibi. The long rise of industrialization depended both on the colonial and elite rapaciousness of taxation, plantation, and the expropriation of Native lands, and on the enslavement of people who could be considered endlessly workable and less than human, as demonstrated in the scholarship of so many thinkers, from Jotirao Phule and Vine Deloria to Sylvia Wynter, Denise Ferreira da Silva, Paula Chakravartty, Paula Ricaurte, Thenmozhi Soundararajan, and Suzanne Kite.

Though these developments might seem safely ensconced in the deep recesses of the 19th century, the way race and cognate concepts like caste divide up the world matters even to such new players as social media companies. As developed in different contexts in the scholarship of Tressie McMillan Cottom, Robin Kelley, Gargi Bhattacharyya, and in my own work, racial capitalism describes how these divided-up populations are included in the dreams of tech companies in multiple, even contradictory ways. On the one hand, some segments of racialized populations may be included as future consumers, to be courted as valuable users. At the very same time, other segments are included as disposable populations, whose safety and well-being can be sacrificed to the metric of simply having billions of users worldwide. These same disposable lives are used up in another way: they produce the violent, invidious content that then circulates virally online. In other words, the problem of race and technology cannot be solved through the inclusion of Black and Brown populations, because they are already included, as populations that labor, from which value is extracted, and for which inferior, faulty, and violent products may be developed and offered up. As Charisse Burden-Stelly argues, racial capitalism produces a calculation of value minus worth: some bodies, especially Black bodies that intersect with systems of imprisonment, experimentation, and risky work, produce surplus value at the same time that their lives are accorded little worth.

Given Facebook's negligent attitude toward quality control outside the US, one may justifiably ask whether Facebook as it appears in places like Nepal, Sri Lanka, and Burma is even the same product as the one offered in the United States. Given the paltry numbers of content moderators, the company's lack of linguistic and cultural competence, and its failure to take the findings of Dalit and Muslim researchers seriously, it does not seem to be.

We have to ask what would have happened, or what might have been avoided, if the company had taken all the money, energy, concern, and brainpower devoted to chasing young Insta users (to their detriment) and put them toward protecting lives in its pre-existing products. Of course, this will not be done as long as companies like Facebook continue to see the peoples of most of the world as expendable, useful only in terms of their masses. Even for well-off populations in the United States, the ability to turn discourse toward different ends remains latent; it may stay largely metaphorical and comparative, especially because here too the glittery optics of who gets to speak, about what, and for which audiences govern a kind of capitalism that relies heavily on raced and gendered imaginaries of expertise and trustworthiness.

In other words, the framework of racial capitalism is necessary in this current moment to move us beyond inclusion framed as a binary, or even as a first step on a path toward something broadly democratic. Instead, we need to show how seemingly divergent problems are actually yoked together by processes of inclusion that make the terms and conditions of that inclusion disastrous and dehumanizing, in different ways and different modalities, across the globe. We need to broach very real questions about what precisely these multiple versions of Facebook are: one that targets children, another that excuses some from obeying the rules of posting, a third that tweaks the algorithm and ignores its own internal evidence of the cycles of violence that devolve from those tweaks, a fourth that is a marketplace for human trafficking, and yet a fifth that treats entire swathes of its members as disposable lives.

We need to ask whether these other Facebooks that operate beyond the logics of consumer eyeballs are 'junk technologies' foisted on populations who never asked for them, and where else we can look for alternatives. These questions are made possible by recognizing the multiple ways that capital, race, and technology are entwined; the possibilities arise from that analysis, when it becomes sharply clear which places and beings are left out of these calculations. These are the places to go for alternatives, if your analysis can take you there.

The chances of this current lawsuit's success are beyond my capacity to gauge. The filing nevertheless makes for compelling reading. The plaintiffs are arguing for damages to compensate for the losses they incurred when the company's stock dropped after the information contained in the Facebook Files hit the newsstands. The filing alleges that the company's leadership willfully misled shareholders in several meetings about the nature of their business and the degree of risk the business was taking on; it even misled its own Oversight Board. On the terrain of contemporary business practices, misleading shareholders is a more prosecutable offense than going about things in the usual way, that is, treating Black and Brown populations as extractable, extra, destroyable life. But on the terrain of moving past the moment of social media behemoths and their setbacks, it is precisely this usual way of doing business, and its ability to sustain power elites across the world, that has to be undone.
