
In the past few days my inbox has seen an influx of forwards from friends and colleagues, all sharing links covering the recent revelation that Facebook outsources some of its dirtiest work, and that the firms handling Facebook’s outsourced labor pay exploitatively low wages for some of the most psychologically damaging digital work imaginable: the screening of user-uploaded content (posts, images and videos) to Facebook. My colleagues sent these links my way for good reason: this topic has been the primary subject of my own academic research for the past year and a half, ever since I discovered these content moderation practices through a small news story in the New York Times. After reading it, I became riveted both by the workers and the industry it portrayed and by the implications of this practice for the greater digital media/social media ecology. How do these practices change our collective notions of participatory media and our understandings of the costs, financial and human, of using said media? What does it mean about the nature of our online participation, at one time heralded as a great direct-access equalizer, to know that content undergoes screening by unknown agents, who are often low-paid and low-status? What is it about the nature of social media that may encourage the creation and sharing of prurient, shocking or just-this-side-of-bearable content? Who benefits from such material? Who is put at risk? I wanted to explore, too, the impetus to conceal or render invisible these labor practices, which are virtually unknown to those outside the industry and yet an integral part of the production chain of user-generated digital media. These were just a few of a veritable laundry list of questions I generated based on my initial research on this topic. Since then, I have been documenting and writing about these labor practices and the workers involved, mapping them both in material terms and from a theoretical perspective, in my dissertation, Behind the Screen: The Hidden Digital Labor of Commercial Content Moderators.
Meanwhile, the latest chapter in the popular press’s up-until-now scant coverage of the story transpired just last week, when Gawker’s Adrian Chen filed his post entitled, “Inside Facebook’s Outsourced Anti-Porn and Gore Brigade, Where ‘Camel Toes’ are More Offensive Than ‘Crushed Heads’.” Chen’s story focused on practices at Facebook which, he discovered, take place largely via the outsourcing and micro-labor marketplace oDesk (see Brett Caraway’s 2010 article, referenced below, for a nice overview of that company’s practices). Chen’s article is remarkable in a number of ways: first, he was able to focus on real-world examples shared with him by the workers themselves, most of whom are no longer working for Facebook via oDesk, and many of whom are located outside the US, in the so-called “Global South.” The workers’ accounts give concrete examples of, on the one hand, the kinds of egregious and trauma-inducing material they were exposed to and, on the other, wages that seem nowhere near reasonable given the hazards of the work. Here it is interesting to note that much of the outsourced labor that takes place at sites like oDesk or Amazon’s Mechanical Turk is undertaken on a per-item basis, so that workers are paid based on the number of items they are able to screen; I have taken to describing this practice as “digital piecework.” Second, Chen was able to provide the Gawker readership, thanks to the workers he interviewed, with a number of internal documents from oDesk, used for training and quality control by the content screeners. This type of material is generally not available for public view and is considered insider business knowledge; keeping it out of public view allows a company to maintain ambiguity about its screening and censoring practices via more general “user guideline”-style statements that give it plenty of room in which to operate when making subjective content screening decisions. This angle was another particular focus of Chen’s piece, in which he pointed out the strange hierarchy of material and how it is to be adjudicated by the screeners. While Chen’s piece, and subsequent takes on it in the blogosphere and in other, more sensationalistic coverage online, focus on the admittedly disconcerting nature of the material Facebook rejects, the more compelling facts rest just below the surface.
For example, the oDesk internal documents provide a great deal of insight into the kinds of material the low-paid contract laborers could be expected to see. Images and videos of animal abuse and mutilation, child abuse (physical and sexual), gore, disturbing racist imagery and so on are frequent enough that each has a specialized protocol devoted to handling it: keeping such material from making it to the site if it is not yet there, and removing it if it is. As Wendy Chun noted in her 2008 monograph Control and Freedom, at the moment of the Internet’s commercialization the great hysteria around pornography on the nascent Web, and the net at large, was almost invariably directed at the consumer/receiver of the pornography, particularly toward the notion that children might be exposed to said material. Yet, she points out, there was a curious lack of concern for those involved in its production. Similarly, in the case of Facebook and of similar social media sites that employ screening to shield end users from exposure to disturbing or damaging material, we must ask why it is therefore okay for workers, often in other parts of the world, to risk exposure, over and over again, to that very same kind of material. Is it only dangerous when viewed from the consumption side? Is the expectation that the meager wages offered for the work can offset the damage it may cause? Or is this work considered disposable by nature, hidden by design, and outsourced out of convenience and the necessity of finding a labor pool that will work in such conditions for very low pay?
We have a responsibility to get real, collectively, about the true costs of these platforms. Just as we have learned that they aren’t free in terms of the labor we give away and the commodification of our own demographic, usage and other kinds of personal data [Andrejevic, Fuchs, et al.], they also have a much larger footprint than we may realize in terms of human cost. Some of these costs are better known among academics and activists whose interests lie at the intersection of social justice, labor and digital information/media issues, so for many of us the recent revelations about Apple and its relationship with Foxconn, the Taiwanese company that is the largest private-sector employer in China and is known for its unsavory labor practices, widely publicized on the terrific public radio program “This American Life,” were neither surprising nor new. Likewise, the issue of e-waste has for years been identified by numerous journalists, scholars and activists, but frequently fails to register on the radar of mainstream users. Silicon Valley-based labor activist Raj Jayadev has commented on this peculiar collective myopia, saying: “A profound characteristic the popular psyche has accepted about the Information Age is the presumption that technology is produced by some sort of divine intervention so advanced that it requires no actual assembly or manufacturing, the very same features our predecessors in the Industrial Era found so essential. Yet every computer, printer, and technological wizardry in-between bought at the local Radio Shack is birthed in what is usually a very inglorious assembly line production site.” This myopia extends, too, to the end of life of these products, frequently hidden from Western consumers via dispersal around the globe to sites in China, the Philippines, India, Ghana and elsewhere in the world.

Likewise, the myopia continues further into our interactions with our networked machines. Once they arrive, transmuted from the ether into material objects, we plug them in and then turn our collective cognitive dissonance to the Internet, where the predominating origin myth of unfettered possibility for democratic free expression, on the one hand, and the newer, unidirectional user-to-platform-to-dissemination media creation opportunities offered by “Web 2.0” platforms, on the other, still structure our interactions. And yet the unveiling of the existence of these content screeners and the practices in which they engage certainly challenges the end user’s perceived relationship to the social media platform to which she or he is uploading content, by adding new, unknown agents and actors whose agendas, motivations and mere existence are all unclear. What else might be up for reevaluation? What other practices are worthy of another critical glance to identify the human values and actions embedded within them, and how does recognition of those values and actions change our understandings of the practices themselves? My colleague Safiya U. Noble, for one, is asking these kinds of questions right now about Google search in her own dissertation work. Indeed, it is the erasure of these human traces, both physically and in a more abstract sense, that is so fascinating, and we must constantly ask whose benefit those erasures serve. As a mentor of mine once quipped, “‘Human-computer interaction’? I mean, what other kind is there?” Or, as the narrator of the This American Life segment mentioned above pointed out, prior to his trip to Shenzhen, a “special manufacturing zone” in China in which enormous quantities of products are made for export by people in factories, he had simply assumed that his electronics were produced by robots, a fantasy much more comfortable than the reality he discovered.
In 2000, theorist Tiziana Terranova mapped the content production landscape thus: “…the Internet is about the extraction of value out of continuous, updateable work, and it is extremely labor intensive. It is not enough to produce a good Web site, you need to update it continuously to maintain interest in it and fight off obsolescence. Furthermore, you need updateable equipment (the general intellect is always an assemblage of humans and their machines), in its turn propelled by the intense collective labor of programmers, designers, and workers”. Updated for today, we must add actors at both ends of Terranova’s Web content production chain: the social media users who constantly update and refresh sites’ content by uploading their own material or sharing that of others (almost exclusively for free, and for purposes of [re]distribution, which sites offer as a service or feature to the user), and the content screeners who must review that content before it can go live.
Human intervention and immaterial labor are indeed a key, and yet frequently hidden, part of the production chain at online sites that rely upon user-generated, uploaded content to populate themselves and draw in their producers/users/consumers. Content moderators, whose labor and even mere existence are so frequently hidden from view, nevertheless serve an integral role in making decisions that affect the outcome of what content will be made available on a destination site. Given the financial implications (read: benefits) and attention a viral video can have, the tasks performed by these workers are far from inconsequential. The content screeners also view large amounts of material that never makes it to the site for which it was intended, because they deem it unfit based on site guidelines, legal prohibition, or matters of taste: labor that will literally remain unseen by anyone who may visit the site. And while the moderators’ work may not be as physically demanding, dangerous or rigorous as that of the workers whose labor goes into IT manufacturing, it is often just as disregarded, unmentioned or unacknowledged, and it has its own potential for danger, of the psychic variety, given the nature of the material to which the moderators may be exposed.
References
Caraway, Brett. “Online Labour Markets: An Inquiry into oDesk Providers.” Work Organisation, Labour and Globalisation 4, no. 2 (2010): 111–125.
Chun, Wendy Hui Kyong. Control and Freedom: Power and Paranoia in the Age of Fiber Optics. The MIT Press, 2008.
Jayadev, Raj. “South Asian Workers in Silicon Valley: An Account of Work in the IT Industry.” In Sarai Reader 01: The Public Domain, edited by Raqs Media Collective and Geert Lovink, 1:167–170, 2001.
Terranova, Tiziana. “Free Labor: Producing Culture for the Digital Economy.” Social Text 18, no. 2 (2000): 33–58.
Love your blog post and thanks for the shout out! I’ve posted a copy of the article to my blog for any of your readers who might be interested! http://safiyaunoble.wordpress.com/2012/03/08/bitch-magazine-article/
Thanks very much! This is a great article and highly pertinent to the discussion of my work above; the intersections between the systems you are discussing and the humans whose values and labor inform them are inextricable.
Yes, I very much agree. What we both appear to be trying to address are these seemingly “invisible” processes that “naturalize” the Internet and leave no room for questioning how the web is populated with content, and at whose expense. I particularly appreciate the way that you are bringing to light the human beings that are harmed in these processes of creating digital information. I liken it to the ways that I believe communities are harmed by misrepresentations that are either racist, sexist, pornified or in many other ways maligned. I look forward to reading the results of your work.