Business Technology

Sunday 21 April 2019

17 minutes of carnage: New Zealand massacre shows how online users find ways to share violent videos

  • Facebook reveals it removed 1.5m videos of the attack in 24 hours
  • Politicians demand answers amid calls for an end to live-streaming
The live streaming on Facebook of the Christchurch atrocity, allegedly by Brenton Tarrant (bottom left), has led to a string of demands for controls. NZ Prime Minister Jacinda Ardern (right)

Paresh Dave and Munsif Vengattil

New Zealand Prime Minister Jacinda Ardern said Facebook's chief operating officer Sheryl Sandberg had sent condolences over the shootings at two mosques that killed 50 people, part of which were live-streamed over the social media platform.

"Certainly, I have had contact from Sheryl Sandberg. I haven't spoken to her directly but she has reached out, an acknowledgment of what has occurred here in New Zealand," Ardern said at a media conference when asked if Facebook should stop live-streaming.

People pause next to flowers and tributes by the wall of the Botanic Gardens on March 17, 2019 in Christchurch, New Zealand. Photo by Carl Court/Getty Images
New Zealand Prime Minister Jacinda Ardern meets representatives of the Muslim community at Canterbury refugee centre in Christchurch. New Zealand Prime Minister's Office/Handout via REUTERS.

"This is an issue that I will look to be discussing directly with Facebook," Ardern said, adding that Sandberg had shared condolences over the shootings in Christchurch on Friday.

Facebook has said it removed 1.5 million videos of the attack worldwide in the 24 hours after the shootings, 1.2 million of which were blocked at upload.

Mia Garlick, of Facebook New Zealand, said: "We continue to work around the clock to remove violating content using a combination of technology and people".

The Friday massacre at two New Zealand mosques, live-streamed to the world, was not the first internet broadcast of a violent crime, but it showed that stopping gory footage from spreading online persists as a major challenge for tech companies despite years of investment.

A still image taken from the livestream. Photo: Reuters

The massacre in Christchurch was live-streamed by an attacker through his Facebook profile for 17 minutes, according to a copy seen by Reuters. Facebook said it removed the stream after being alerted to it by New Zealand police.

But a few hours later, footage from the stream remained on Facebook, Twitter and Alphabet Inc's YouTube, as well as Facebook-owned Instagram and WhatsApp. It also remained available on file-sharing websites such as New Zealand-based Mega.nz.

People who wanted to spread the material had raced into action, rapidly repackaging and distributing the video across many apps and websites within minutes.

Facebook, Twitter, YouTube and Mega on Friday said they were taking action to remove the copies.

Other violent crimes that have been live-streamed include a father in Thailand in 2017 who broadcast himself killing his daughter on Facebook. After more than a day, and 370,000 views, Facebook removed the video.

In the United States, the 2017 assault in Chicago of an 18-year-old man with special needs, accompanied by anti-white racial taunts, and the fatal shooting of a man in Cleveland, Ohio, that same year were also live-streamed.

Facebook, the world's largest social media network with about 2.3 billion monthly users around the world, tripled the size of its safety and security team to 30,000 people over the last three years to respond more quickly to reports of offensive content. It has also focused on developing artificial intelligence systems to catch material without the need for users to report it first.

But the viral reach of yet another obscene video caused politicians around the globe on Friday to voice the same conclusion: Tech companies are failing.

CHARGED: Brenton Tarrant was charged yesterday with murder in relation to last week's mosque attacks

Some analysts expect Facebook to suffer consequences.

Facebook "helped provide a platform for today's horrific attack and will undoubtedly be called into question for facilitating the spread of this," said Clement Thibault, analyst at financial data website Investing.com.

The company's profit margins fell last year as it spent to address the challenge, and stock analysts are bracing for further short-term hits to its profitability, whether or not regulations materialize and despite relatively few alternatives for advertisers.

EVADING DETECTION

Users intent on sharing the violent video took several approaches. Copies reviewed by Reuters showed that some users had recorded the video playing on their own phones or computers to create a new version with a digital fingerprint different from the original to evade companies' detection systems. Others shared shorter sections or screenshots from the gunman’s livestream. The shooting begins about six minutes into a 17-minute video reviewed by Reuters. It starts with the attacker driving to a mosque.
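The re-recording tactic works because the simplest form of duplicate detection keys on a file's byte-level fingerprint. The following is a minimal illustrative sketch, not any platform's actual system: re-encoding or re-recording a clip changes its bytes, so a cryptographic hash of the new file no longer matches the hash of the original.

```python
import hashlib

# Simulated video files: the "re-recorded" copy differs by a single byte,
# standing in for the many byte-level changes a real re-encode introduces.
original = b"\x00\x01\x02 simulated video bytes \x03\x04"
rerecorded = b"\x00\x01\x02 simulated video bytes \x03\x05"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(rerecorded).hexdigest()

print(h1 == h2)  # False: even a tiny change defeats exact-hash matching
```

Production systems therefore also use perceptual fingerprints that tolerate re-encoding, but as the copies reviewed by Reuters showed, those too can be evaded.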

On internet discussion forum Reddit, users strategized to avoid the actions of moderators, directing each other to video apps which had yet to take action and sending footage through messaging apps.

Besides acting on user complaints about copies, YouTube said on Friday that it was trying to identify copies with an automated tool that finds videos likely to be violent in nature based on a combination of the title and description of the video, the characteristics of the user uploading it and objects in the footage.

Exact matches of removed material cannot be uploaded again at YouTube and Facebook.

Facebook said it, too, was relying on user complaints and an artificial intelligence system to identify violent footage and send it to moderators.

It also was using audio technology to detect Christchurch broadcast footage, in which gunshots could be heard and music played in the attacker's car, according to a copy reviewed by Reuters.
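Audio matching of this kind can be pictured as sliding a known audio fingerprint across an incoming stream and flagging a match. The toy sketch below uses exact sample-window comparison purely for illustration; real systems compare spectral fingerprints that survive re-encoding and background noise.

```python
def contains_fingerprint(stream: list[int], fingerprint: list[int]) -> bool:
    """Toy detector: slide the known fingerprint window across the
    stream's samples and report whether any window matches exactly."""
    n = len(fingerprint)
    return any(stream[i:i + n] == fingerprint
               for i in range(len(stream) - n + 1))

# Hypothetical sample data standing in for decoded audio.
known_gunshot = [9, 7, 9]
broadcast = [1, 2, 9, 7, 9, 3]

print(contains_fingerprint(broadcast, known_gunshot))  # True
```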

Researchers and entrepreneurs specializing in detection systems said they were surprised that users in the initial hours after the attack were able to circumvent Facebook's tools.

Joshua Buxbaum, chief executive of Irvine, California-based moderation technology company WebPurify, said Facebook and other services could employ image recognition or other types of AI to identify copies in additional ways.

"I would certainly think given the budgets they have that they would have the ability to root out these videos," Buxbaum said.

Experts said the companies could set their detection tools and removal processes to be more aggressive, but YouTube and Facebook have said they want to be careful not to remove sensitive videos that either come from news organizations or have news value.

Politicians in multiple countries said social media companies need to be more vigilant.

"This is a case where you’re giving a platform for hate," Democratic U.S. Senator Cory Booker, who is running for president, said at a campaign event in New Hampshire. "That’s unacceptable, it should have never happened, and it should have been taken down a lot more swiftly."

Britain's interior minister, Sajid Javid, said on Twitter, "Enough is enough."
