Facebook content moderation is an ugly business. Here’s who does it

Staff sit at computers inside a Facebook content moderation center in Berlin. (Soeren Stache/Getty Images)

Some of the workers saw video of a man being stabbed to death. Others viewed acts of bestiality. Suicides and beheadings popped up too.

The reason for watching the gruesome content: to determine whether it should be pulled from Facebook before more members of the world’s largest social network could see it.

Content moderators protect Facebook’s 2.3 billion users from exposure to humanity’s darkest impulses. Working through posts that have been flagged by other members of the social network or by the Silicon Valley giant’s artificial intelligence tools, they quickly decide what stays up and what comes down. But reviewing the posts comes at a cost. Constant exposure to violence, hatred and sordid acts can wreak havoc on a person’s mental health. One content moderator has already filed a lawsuit claiming she developed post-traumatic stress disorder. There’s a reason being a content moderator has been called “the worst job in technology.”

It’s also an important job, and one that isn’t handled by Facebook employees. Instead, it’s outsourced to contractors, some of whom turn to drugs and sex in the workplace to distract themselves from the abhorrent images they see every day, according to a recent story in The Verge, which reported that some of the workers make as little as $28,800 per year. That’s just over the federal poverty level for a family of four.

Contracting in the tech industry has reached a flashpoint, escalating tensions in Silicon Valley’s world of haves and have-nots. Contractors and temps don’t get the health care or retirement benefits that full-time employees do, a difference that hasn’t gone unnoticed. Last year, contract workers at Google protested, demanding higher wages and benefits.

In a statement, Facebook defended the use of contractors, saying it gives the social network flexibility in where to concentrate its efforts.

“We work with a global network of partners, so we can quickly adjust the focus of our workforce as needed,” a Facebook spokeswoman said in a statement. “For example, it gives us the ability to make sure we have the right language expertise — and can quickly hire in different time zones — as new needs arise or when a situation around the world warrants it.”

Here’s a look at five of the companies Facebook works with to police content.


Cognizant

A multinational provider of services to technology, finance, health care, retail and other companies, Cognizant offers services including app development, consulting, information technology and digital strategy.

Based in Teaneck, New Jersey, Cognizant has roughly 281,600 employees around the world, according to its annual report. Nearly 70 percent of its workforce is in India.

The company’s role in supporting Facebook’s content moderation activities was the subject of the recent story in The Verge, which reported that roughly 1,000 Cognizant employees at its Phoenix office evaluate posts for potentially violating Facebook rules against hate speech, violence and terrorism.

Cognizant Technology Solutions office in Chennai, India. The company works with Facebook on content moderation. (Madhu Kapparath/Getty Images)

The workers get two 15-minute breaks, a 30-minute lunch and nine minutes of “wellness time” per day. They also have access to counselors and a hotline, according to the report.

Still, some workers said that constant exposure to depravity has taken its toll. One former content moderator said he started to believe conspiracy theories, such as 9/11 being a hoax, after reviewing videos promoting the idea that the terrorist attack was faked. The former employee said he had brought a gun to work because he feared that fired employees would return to the office to harm those who still had jobs.

Cognizant said it looked into “specific workplace issues raised in a recent report,” that it had “previously taken action where necessary” and that it has “steps in place to continue to address these concerns and any others raised by our employees.”

The company outlined the resources it offers employees, including wellness classes, counselors and a 24-hour hotline.

“In order to ensure a safe working environment, we continuously review our workplace offerings, in partnership with our clients, and will continue to make necessary enhancements,” Cognizant said in a statement.

PRO Unlimited

Based in Boca Raton, Florida, PRO Unlimited provides services and software used by clients in more than 90 countries. One of its clients: Facebook.

Last year, Selena Scola, a former PRO Unlimited employee who worked as a Facebook content moderator, filed a lawsuit against the two companies alleging that she suffered from psychological trauma and post-traumatic stress disorder caused by viewing thousands of disturbing images of violence. Scola’s PTSD symptoms can pop up when she hears loud noises or touches a computer mouse, according to the lawsuit.

“Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator,” the lawsuit states.

Filed in San Mateo County Superior Court in Northern California, the lawsuit alleges Facebook violated California law by creating dangerous working conditions. Facebook content moderators are asked to review more than 10 million posts per week that may violate the social network’s rules, according to the lawsuit, which seeks class-action status.

PRO Unlimited didn’t respond to a request for comment. At the time the lawsuit was filed, Facebook acknowledged the work can be stressful and said it requires the company it works with for content moderation to provide support such as counseling and relaxation areas.

Facebook in a court filing denied Scola’s accusations and called for the case to be dismissed. 


Accenture

One of the most prestigious consultancies in the world, Dublin-based Accenture has more than 459,000 people serving clients across 40 industries and in more than 120 countries, according to its website.

People enter an Accenture office in downtown Helsinki. (Jussi Nukari/Getty Images)

In February, Facebook content reviewers at an Accenture facility in Austin, Texas, complained about a “Big Brother” environment, alleging they weren’t allowed to use their phones at their desk or take “wellness” breaks during the first and last hour of their shift, according to a memo obtained by Business Insider.

“Despite our pride in our work, Content Moderators have a secondary status in [the] hierarchy of the workplace, both within the Facebook and the Accenture structure,” the memo read.

Accenture didn’t respond to a request for comment. At the time, Facebook said there had been a “misunderstanding” and that content moderators are encouraged to take wellness breaks at any time throughout the day.

Some of Accenture’s clients have included other tech giants such as Google, Microsoft and Amazon. More than three-quarters of Fortune Global 500 companies work with Accenture.  


Arvato

One of Facebook’s largest content moderation centers is in Germany, a country that last year began enforcing a strict hate speech law that fines social media companies up to 50 million euros ($58 million) if they don’t remove hate speech and other offensive content quickly enough.

Arvato, owned by the German media company Bertelsmann, runs a content moderation center in Berlin. The company has faced complaints about working conditions and the toll the job takes on workers’ mental health.

In 2017, Arvato said in a statement that it takes the well-being of its employees seriously and provides health care and access to company doctors, psychologists and social services.




The company, based in Gütersloh, Germany, has 70,000 employees in more than 40 countries. It’s been providing Facebook with content moderation services since 2015.

Arvato, which was rebranded last week as Majorel, said it offers content moderators a salary that’s 20 percent above minimum wage and support such as wellness classes and counselors. Workers can also take “resiliency breaks” at any time of the day.

“We are proud to be a partner of Facebook and work in alignment with them to offer a competitive compensation package that includes a comprehensive benefits package,” a company spokesperson said in a statement. “We will continue to work together to improve our offerings and support of our employees.”


Genpact

New York-based professional services firm Genpact won a contract with Facebook last year to provide content moderation, according to The Economic Times.

Concerns about the mental health of Facebook content moderators weren’t enough to scare off applicants in India, who flocked to jobs that paid between 225,000 and 400,000 rupees a year (about $3,150-$5,600). Genpact was searching for content moderators fluent in Tamil, Punjabi and other Indian languages.

Some Genpact workers have complained about low pay and a stressful work environment, according to a report this week by Reuters. One former Genpact employee told the news outlet that at least three times he’s “seen women employees breaking down on the floor, reliving the trauma of watching suicides real-time.”

Facebook pushed back against allegations of low pay and outlined the work it’s doing to improve working conditions for content moderators.

In an email, a Genpact spokesperson confirmed that it partners with Facebook but said it doesn’t comment on work with clients.

“As a company we bring our extensive experience in the field of content review and operations to our partners by providing industry-leading support for our team of content reviewers and a best-in-class working environment,” the Genpact spokesperson said in a statement. “We take very seriously this work and the services that we provide to our clients.”
