A Facebook moderator says she took down beheadings, child pornography, and animal abuse every day — but was 'treated like nothing'
Published 2:33 am, Thursday, April 26, 2018
Monika Bickert, Facebook's head of global policy management, acknowledged to the BBC that Facebook moderating was difficult work and said support systems were in place for employees. (Getty Images)
- A former Facebook moderator described disturbing imagery she says she had to remove from the site every day.
- In an interview with the BBC, the unnamed woman said she had to make quick decisions about taking down photos and videos containing beheadings, animal abuse, and child pornography.
- She said the work gave her nightmares, and she accused Facebook of not providing enough support to staff members.
- Facebook's head of global policy management told the BBC that graphic content was "a small fraction of what reviewers might see" and that the company is committed to giving moderators the tools to do well.
A former Facebook moderator described to the BBC the horrors she was exposed to every day and criticized the social network for not doing enough to support staff handling disturbing imagery.
The content reviewer, who worked in a Facebook center in Berlin, spoke to the BBC on the condition of anonymity.
She told the BBC that she would have five seconds to decide whether to remove some disturbing photos and videos. Among the worst images, she said, were beheadings, animal abuse, and child pornography.
The woman suggested that the work affected her mental health, describing a vivid nightmare she had during her time at Facebook:
"I had nightmares a couple of times. I remember one, for example — people jumping from a building. I don't know why. And I remember people, instead of helping the people jumping, they were just taking photos and videos ... I woke up crying."
She accused Facebook of not providing enough support to content reviewers and said staff members regularly complained to management.
"It's the most important job in Facebook, and it's the worst, and no one cares about it," she said.
In a message directed at Facebook CEO Mark Zuckerberg, she added: "How are you allowing this to happen? That young people like us are having to see these things, but we were treated like nothing."
Facebook did not immediately respond to Business Insider's request for comment.
Monika Bickert, Facebook's head of global policy management, acknowledged to the BBC that Facebook moderating was difficult work and said support systems were in place for employees.
"This work is hard, but I will say that the graphic content is a small fraction of what reviewers might see," she said. "Increasingly we've been able to use technology to review and remove some of the worst content."
Bickert added: "We're committed to giving them what they need to do this job well. If they're ever uncomfortable at work, there are counseling resources for them, and they can be shifted to work on a different kind of content."
Facebook this week published the internal guidelines its moderators use. The roughly 8,500-word document details what is and isn't allowed on the platform, including its policies on sexual and violent content and hate speech.
Facebook is increasingly relying on artificial intelligence to identify offending items on its site. But Zuckerberg said on Wednesday that it was "easier to build an AI system to detect a nipple than what is hate speech."