Facebook to start allowing appeals for removed posts as part of transparency effort
Facebook is to start allowing appeals for posts that are removed for nudity, sexual activity, hate speech or graphic violence.
The social media giant, which has been under fire in recent months for how users’ data has been manipulated, says that it is introducing the measure in an attempt to make its processes more transparent.
It has also published the internal guidelines it uses when deciding how to enforce its standards.
“One of the questions we're asked most often is how we decide what’s allowed on Facebook,” said Monika Bickert, Facebook’s vice president of global product management.
“For years, we’ve had Community Standards that explain what stays up and what comes down. Today we’re going one step further and publishing the internal guidelines we use to enforce those standards. And for the first time we’re giving you the right to appeal our decisions on individual posts, so you can ask for a second opinion when you think we’ve made a mistake.”
Under the appeals system, if a photo, video or post has been removed because it violates Facebook’s Community Standards, the user will be notified and given the option to request “additional review”.
“This will lead to a review by our team, which is always by a person, typically within 24 hours,” said Bickert. “If we've made a mistake, we will notify you, and your post, photo or video will be restored.”
Ms Bickert said that Facebook intends to extend the process “by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up”.
She said that the company was publishing the internal guidelines it uses to decide whether posts should be removed in order to help users understand why it takes such decisions.
“First, the guidelines will help people understand where we draw the line on nuanced issues,” she said.
“Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.”
“In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers,” said Ms Bickert.
“When that's the case, we work to fill those gaps. More often than not, however, we make mistakes because our processes involve people, and people are fallible. It’s a challenge to accurately apply our policies to the content that has been flagged to us.”