Thursday 14 December 2017

So how long did it take the tech giant to react to our complaints of offence?


Ian Begley

Facebook is facing major criticism over the length of time it takes to remove offensive and illegal content.

To test how efficient its security system is, I reported a total of six Facebook pages which I thought violated its 'Community Standards'.

They were:

  • Weed in Dublin;
  • Irish Escorts;
  • Columbine Hero (a page glorifying the notorious Columbine High School shooters);
  • Everybody draw Mohammed Day;
  • Pro-Fat Shaming;
  • I f***ing hate Islam.

This week the social network took more than 24 hours to take down the horrific live-stream video of a 20-year-old Thai man murdering his 11-month-old daughter.

This video was viewed by nearly 400,000 people before being removed.

A week earlier, the Facebook Live murder of an elderly man in Cleveland, Ohio, was taken down two hours after it was posted.

I wanted to see how long it took Facebook to respond to the pages I had flagged.

The process was a lot easier than I expected.

Once you click the 'report' button on the offending page, you are given a list of five options for why you deem it inappropriate.

If you select 'I don't think it should be on Facebook', it gives you another list of options asking why.

These range from sexually explicit content and hate speech to the buying and selling of illegal products.

After you make your selection, Facebook gives you the choice of submitting the page for review or receiving instructions on how to block it from appearing on your timeline.

In under an hour, three of my reports were dismissed because the pages were not deemed to violate its community standards.

The pages were 'Everybody draw Mohammed Day', 'Pro-Fat Shaming' and 'I f***ing hate Islam'.

I also received an automatic message following its response.

"We've looked over the page that you reported, and although it doesn't go against any of our specific community standards, we understand that the page or something shared on it may still be offensive to you.

"We want to help you avoid things that you don't want to see on Facebook."

Although I was surprised that these pages did not get taken down, Facebook took considerably longer to make a decision on the other three I reported.

Eventually, after approximately 15 hours, the social network informed me that 'Weed in Dublin', 'Irish Escorts' and 'Columbine Hero' were removed.

I also received an automated message from the social network following its decision on each page.

One read: "We've reviewed the page that you reported for promoting graphic violence. As it violated our community standards, we've removed it.

"Thanks for your report. We've let Columbine Hero know that their page has been removed, but not who reported it."

Fianna Fáil spokesman on communications Timmy Dooley said: "It seems to me that the process of reporting unacceptable content on Facebook is too cumbersome, and that Facebook don't make it easy for users to ensure that inappropriate content is removed in a timely and efficient manner."

Mark Zuckerberg has pledged to do more to prevent offensive and illegal content from being posted or streamed live on the site.

A spokesperson for Facebook told the Irish Independent: "We have built an extensive reporting infrastructure that enables people to report other individuals and suspicious activity quickly.

"Every piece of content, profile, group or page on Facebook can be reported to us and is reviewed on a 24/7 basis by members of our large, expertly trained community operations team based in locations around the world."

Irish Independent
