'We don't exist to them'

By Estelle Atkinson, Alexandra Goldberg, Marie Louise Leone and Piper Vaughn

The troubling reality of content moderation

Section 230 says that tech platforms don’t have to limit graphic content. The implications of this fall hard on the workers who moderate it.

Ed Stackhouse says seeing internet posts of war violence, suicide and mass shootings as a content rater doesn’t get any easier.

"If you've got any shred of humanity left in you, you're not callous to them," Stackhouse said.

The 49-year-old employee of Appen-RaterLabs has been rating graphic content for Google as an independent contractor for 10 years.

Faced with low wages, a lack of benefits and minimal in-house health services, Stackhouse, who lives in Asheville, North Carolina, and his fellow content raters pushed to unionize. In January 2021, the Alphabet Workers Union launched publicly. Raters began to join alongside Google employees, vendors and contractors in 2022.

Content rater Ed Stackhouse at his workspace. (Ed Stackhouse)

After sorting through graphic content day after day, the raters have decided to fight back. In June, Stackhouse and six Appen-RaterLabs employees filed an Unfair Labor Practice (ULP) report after being fired from the company, a move they viewed as retaliation for unionizing. The report is currently under review with the National Labor Relations Board, but is thought to be the catalyst for the employees being rehired just weeks later.

Appen-RaterLabs, a subcontractor for Google, employs a team of more than one million remote workers. Some workers can opt in to comb through potentially graphic content and flag material they deem to be pornographic, upsetting or offensive. Raters are on the front lines of content moderation and make decisions about whether or not to flag content, but higher-ups have the final say. The raters do not have the ability to remove content, and they often don’t know whether graphic content is ultimately taken down. From the little contact they’ve had with their union "siblings" at Google, they believe their ratings go to Google engineers who review the comments and flags. However, this theory is unconfirmed, according to Stackhouse.

"We are ghosts, we don't exist."

— Ed Stackhouse

RaterLabs employees follow a manual called the General Guidelines, which contains more than 160 pages of rating rules and concepts, according to a job resource website. Some tasks also have individual rules for raters to follow, but Stackhouse says that most of the time raters must use their best judgment.

The content they examine varies in severity. Stackhouse said he sees gory medical content, pornography and suicide forums; the suicide forums affect his well-being the most.

"I just want to be able to reach out to this person and wrap my arms around them and say, 'we love you, let's figure out something better.' But we can't do that, we can't interact," he said.

Just as this content may be distressing for internet users, it is also distressing to the content raters employed to spend hours reviewing and assessing it. Despite the harms posed by graphic content online, big tech companies have no legal incentive to remove it. They are granted blanket immunity by a legal provision contained in the Communications Decency Act: Section 230.


Section 230 provides that digital media companies will not be treated as publishers for legal purposes, meaning that they will not be held liable for either removing constitutionally protected content or hosting harmful or illegal content.

According to Stackhouse, content raters are invisible to Google, and are excluded from conversations about fair wages and better benefits.

"We are ghosts, we don't exist," he said.

The war on wages

RaterLabs employees do not receive livable wages, health care or retirement benefits because of their status as independent contractors, Stackhouse said. In 2022, when content raters started joining the Alphabet Workers Union (AWU), Stackhouse was among the first seven known raters admitted. Now, he says, the number of raters in the union is in the 80s.

Stackhouse believes he and five other employees were terminated on May 31, 2023, because they were vocal about unfair working conditions or involved in the union. Appen said they were fired due to "business conditions," according to Stackhouse.

"I'm sure they wouldn't have offered us the jobs back if we didn't file the ULP against them," he said.

In one campaign, raters sent emails to Appen CEO Mark Brayan expressing their concerns about workplace conditions. That led to a November 2022 meeting with Brayan in which six union members demanded higher pay, better benefits and more transparency from management. The effort produced a major win for the union: raters earned their first raises in 10 years at the beginning of 2023.

Due to the raise, raters now make $14 an hour. In an email viewed by USC Annenberg, Appen said that this marks a pay raise of $4. However, RaterLabs employee Jay Buchanan said it's a $1.50 increase from their previous rate of $12.50 an hour.

Further, Stackhouse had started out working 40 hours a week for Leapforce, but when that company was acquired by Appen, his hours were cut to 26.

On Feb. 1 of this year, more than 200 Google employees went to Google headquarters in Mountain View, California, to deliver a petition signed by more than 500 raters; Google refused to accept it. That same day, Appen sent out a "union-busting email" that described consequences for organizing, Stackhouse said.

According to sections 7 and 8 of the National Labor Relations Act, employees have the right to join a union. The law states that employers cannot convey a message that "selecting a union would be futile." An email sent by Appen to raters on Feb. 1, 2023, which was reviewed by USC Annenberg, stated that RaterLabs will "protect the rights of all eligible employees" to decide whether or not to unionize.

However, the email also contained the following statement regarding unionization: "A Union can promise many things but your pay could increase, decrease, or stay the same with a union. While in a union, YOU WILL PAY DUES" (original emphasis).

The email also touted the benefits of working directly with RaterLabs management, stating "...we believe working directly with workers is best."

Section 7 of the Act also gives workers the right to refrain from joining a union, except where an agreement between the employer and a union makes membership a condition of employment. A copy of a contract presented to raters by Appen, viewed by USC Annenberg, does not mention unionization.

The Google Playa Vista Campus in Los Angeles, CA. (Marie Louise Leone)

Google's headquarters in LA, which the company built in a repurposed airplane hangar. (Marie Louise Leone)

Google and YouTube headquarters in LA. (Marie Louise Leone)

Meta's local headquarters in Los Angeles, CA. (Marie Louise Leone)

The same email sent by Appen said that "...if Raterlabs takes steps to form a union and it passes, EVERYONE will be required to be in the union and pay dues. No one gets to opt out" (original emphasis).

Google spokesperson Courtenay Mencini stated that Google is not the employer of Appen workers.

“Appen, as the employer, determines their working conditions, including pay and benefits, hours and tasks assigned, and employment changes – not Google,” said Mencini.

“We’ve long had contracts with unionized suppliers and do not treat anyone differently should they choose to join a union or not. We, of course, respect the labor rights of Appen employees to join a union, but it’s a matter between them and their employer, Appen,” she said.

Appen-RaterLabs has not responded to repeated requests for comment.

The digital workplace

Jay Buchanan, 31, who uses they/them pronouns, has worked as a rater with Appen-RaterLabs for seven years and has been a member of the union for over one year. Buchanan, who lives in Asheville, North Carolina, says they "opted in" to see upsetting and offensive content, including pornography, illegal activity and violence, in part to help mitigate racist sentiment in those spaces.

The workspace of content rater Jay Buchanan. Buchanan has worked with Appen since 2016. (Jay Buchanan)

Ed Stackhouse's workspace. Stackhouse has worked with Appen for 10 years. (Ed Stackhouse)

Buchanan says Appen needs a better welfare system in place for employees, given the graphic nature of the content they review every day. According to a RaterLabs newsletter obtained by USC Annenberg, employees have access to articles and resources for well-being, including tips on exercising at their desks and getting better-quality sleep. Employees also have access to the Employee Assistance Program, a confidential counseling service.

Jay Buchanan, content rater.

Buchanan says this service is free for only three sessions per year. "It is absolutely inadequate," they said. "This is what disgusts me."

"It is absolutely inadequate."

— Jay Buchanan

Stackhouse adds that timing is a major concern among raters. In some cases, raters are allotted just minutes to check the accuracy and quality of AI-generated content.

"You could have a load of points of data that you have to check the accuracy of, but only get two minutes to do it," Stackhouse said.

If a rater goes over the allotted time for a task, they are locked out of the rating system and flagged for productivity. If their ratings are inaccurate, they may be fired or furloughed over a bad accuracy review.

"So you really have to weigh which way you are going to go," he said. "Am I going to be ultimately super accurate and take too much time and get locked out? Or am I going to be as accurate as I possibly can and cut it off quickly?"

Stackhouse provided an example of a task in which AI generates inaccurate details about the Civil War.

If raters do not train the AI by identifying falsehoods, the inaccurate information will be uploaded to its databases, Stackhouse said. "Ten to 20 years down the road, as people are continually talking to AI and they're no longer looking at books, they're deceived. And all of a sudden, history has changed."

Ed Stackhouse, content rater.

The same goes for bias around political candidates, where AI has the potential to direct users to incorrect information, Stackhouse said.

"This could mislead a lot of people. And it could eventually threaten democracy itself. So it's irresponsible what’s going on right now," he said.

Stackhouse is a point person within the union and even signed a letter to Congress expressing doubts about raters’ ability to moderate artificial intelligence, given the state of their workplace.

The legal landscape

It's not clear that the industry will undergo broad changes anytime soon. Looking forward, the legal position of the tech giants, and their consequent responsibility for the content on their platforms, depends on the fate of Section 230.

The provision has been deliberated in court several times over the last few months, with the most notable decisions, handed down in May, leaving Section 230 and the protection it offers big tech intact.

Lawmakers have attempted to regulate the digital media landscape with a bill that would end Section 230 immunity for digital platforms, specifically in relation to AI. However, despite this bipartisan effort, Section 230 remains a politically divisive topic.

Shoshana Weissmann, digital director and fellow at R Street, a think tank that develops policy solutions, believes the issue reflects broader political debates about the First Amendment. Those on both sides of the partisan debate "hate" Section 230, she said, "...because it allows for free speech in ways they don’t like."

Nonetheless, Weissmann believes that the recent judicial decisions suggest that Section 230 will remain law for the foreseeable future.

In the meantime, Stackhouse said that as raters, "...our responsibility is to make sure that things are absolutely truthful, accurate and unbiased. We are basically the gatekeepers of the information. If the information flows out from us, and it’s inaccurate, biased or plain misleading, we've not done our job."

Besides pay rates and benefits, raters care about the quality of their product. That concern is part of a larger conversation the AWU wants to have with Google. According to Stackhouse, he and his peers "...believe that, under ideal circumstances, we can make a difference."
