• A former Facebook moderator described disturbing imagery she says she would have to remove from the site every day.
  • In an interview with the BBC, the unnamed woman said she had to make quick decisions about taking down photos and videos containing beheadings, animal abuse, and child pornography.
  • She said the work gave her nightmares, and she accused Facebook of not providing enough support to staff members.
  • Facebook's head of global policy management told the BBC that graphic content was "a small fraction of what reviewers might see" and that the company is committed to giving moderators what they need to do the job well.

A former Facebook moderator described to the BBC the horrors she was exposed to every day and criticized the social network for not doing enough to support staff handling disturbing imagery.

The content reviewer, who worked at a Facebook content-moderation center in Berlin, spoke to the BBC on the condition of anonymity.


She told the BBC that she would have five seconds to decide whether to remove some disturbing photos and videos. Among the worst images, she said, were beheadings, animal abuse, and child pornography.

The woman suggested that the work affected her mental health, describing a vivid nightmare she had during her time at Facebook:

"I had nightmares a couple of times. I remember one, for example — people jumping from a building. I don't know why. And I remember people, instead of helping the people jumping, they were just taking photos and videos ... I woke up crying."

She accused Facebook of not providing enough support to content reviewers and said staff members regularly complained to management.

"It's the most important job in Facebook, and it's the worst, and no one cares about it," she said.

In a message directed at Facebook CEO Mark Zuckerberg, she added: "How are you allowing this to happen? That young people like us are having to see these things, but we were treated like nothing."

Facebook did not immediately respond to Business Insider's request for comment.

Monika Bickert, Facebook's head of global policy management, acknowledged to the BBC that content moderation was difficult work and said support systems were in place for employees.

"This work is hard, but I will say that the graphic content is a small fraction of what reviewers might see," she said. "Increasingly we've been able to use technology to review and remove some of the worst content."

Bickert added: "We're committed to giving them what they need to do this job well. If they're ever uncomfortable at work, there are counseling resources for them, and they can be shifted to work on a different kind of content."

Facebook this week published the internal guidelines its moderators use. The 8,500-word document details what is and isn't allowed on the platform, including policies on sexual and violent content and hate speech.

Facebook is increasingly relying on artificial intelligence to identify offending content on its site. But Zuckerberg said on Wednesday that it was easier "to build an AI system to detect a nipple" than to determine what is hate speech.
