Viral mosque shooting video raises questions about social media firms’ responsibilities

  • thepeels
  • March 15, 2019

The Facebook livestreaming and subsequent widespread sharing of a shooting that killed 49 people at two mosques in Christchurch, New Zealand, is raising questions about social media firms’ abilities and responsibilities to stop their platforms from being used to propagate hate and inspire violence.

The attack during Friday prayers on March 15 was recorded and livestreamed on Facebook, apparently by the attacker; the video remained on the platform until police contacted the social media company to have it removed.

Philip Mai, director of business and communications at Ryerson University’s Social Media Lab, said it does appear that the original video was taken down faster than in previous incidents like this.

But he noted that it still took some time because it required police to intervene.

“By then, the damage has already been done,” he said.

The video, 17 minutes long at its full length, was subsequently shared on various social media platforms, including Twitter and YouTube, for hours, despite police appeals not to share the videos and social platforms’ reported attempts to stamp out circulating copies.

BuzzFeed tech reporter Ryan Mac reported that he was still seeing copies circulating 18 hours later.

Mai said social media sites are often able to remove content such as music videos that they believe violate someone’s copyright far more proactively and automatically, using artificial intelligence.

And Facebook announced Friday that it was planning to use artificial intelligence to automatically flag “revenge porn” for removal.

“The technology’s there,” Mai said. “But for whatever reason, this kind of thing is not being flagged as quickly as other types of content.”

He acknowledged that live videos are unique and may be harder to detect and flag than content like music videos, but he said there have been enough such incidents to program a computer to watch for certain patterns.
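The automated flagging Mai describes commonly rests on perceptual hashing: frames of a known-bad video are reduced to compact fingerprints, and any upload whose fingerprints fall within a small distance of them is flagged for review. The sketch below is a minimal illustration of that general idea using the open-source Python imagehash library; the file names, threshold, and workflow are illustrative assumptions, not a description of any platform's actual system.

```python
# A minimal sketch of perceptual-hash matching, the general technique
# behind flagging re-uploads of known content. The imagehash library is
# real; the file names and distance threshold are illustrative only.
from PIL import Image
import imagehash

# Hypothetical: fingerprints of frames from a video already flagged for removal.
blocked_hashes = [imagehash.phash(Image.open("flagged_frame.png"))]

def is_flagged(frame_path: str, max_distance: int = 8) -> bool:
    """Return True if a frame is perceptually close to any blocked frame."""
    candidate = imagehash.phash(Image.open(frame_path))
    # Subtracting two ImageHash objects yields their Hamming distance;
    # small distances survive re-encoding, resizing, or watermarking,
    # which is why copies can be caught even after minor edits.
    return any(candidate - known <= max_distance for known in blocked_hashes)

print(is_flagged("uploaded_frame.png"))
```

Matching frames this way is far cheaper than classifying video content from scratch, which is one reason copyright systems built on known reference files can act faster than systems that must recognize new violent footage as it streams.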

Need for regulation?

However, putting in place such a system, along with a process for people to appeal removal, and hiring humans to make the final call could be expensive, he added. It’s something companies may choose not to do if it isn’t required by law.

“I think that governments should be looking into laws and have a public debate as to what responsibility these companies have,” he said. “What does society want from these companies and what do we need to impose on them?”

Stephanie Carvin, an assistant professor of international affairs who researches terrorism, told CBC News Network she thinks governments should be asking more questions of social media companies about their role in making it easy to share extremist material.

While many social media companies have started taking Islamist extremist content more seriously and dealing with it, she said, “they’ve been far less willing” to do that with far-right extremism, and governments should ask why.

“And do we need some kind of regulation going forward that … not just forces them to adhere to the law, but to their own stated standards about violence and hateful rhetoric?” she said.
