Facebook, YouTube and Twitter struggled to stamp out footage of the New Zealand shooting for hours after the attack, having been caught off guard.

One of the shooters appears to have broadcast the attack live on Facebook (FB). The disturbing video, which has not been verified by CNN, lasted nearly 17 minutes and appears to show the shooter approaching a mosque and opening fire.

"New Zealand police alerted us to a video on Facebook shortly after the livestream began, and we quickly removed both the shooter's Facebook and Instagram accounts and the video," said Mia Garlick, Facebook's director of policy for Australia and New Zealand, in a statement.

Facebook declined to say exactly when the video was removed.


Hours after the attack, however, copies of the horrific video continued to appear on Facebook, YouTube and Twitter, raising new questions about the companies' ability to police harmful content on their platforms.


"We are removing any praise or support for the crime and the shooter or shooters as soon as we become aware of it," Garlick said.

Twitter (TWTR) said it had suspended an account related to the shooting and was working to remove the video from its platform.

YouTube, which is owned by Google (GOOGL), removes "shocking, violent and graphic content" as soon as it becomes aware of it, a Google spokesperson said. YouTube also declined to say when the video was first removed.

New Zealand police have asked social media users to stop sharing the footage allegedly showing the shooting and said they were working to have the videos taken down.

CNN has chosen not to publish further details from the video until more information is available.

Technology companies "do not see this as a priority"

This is the latest instance of social media companies being caught off guard by killers posting videos of their crimes, and by other users sharing the disturbing footage. It has happened in the United States, Thailand, Denmark and other countries.

Friday's video rekindles questions about how social media platforms handle offensive content: Are the companies doing enough to catch this kind of material? How quickly should they be expected to remove it?

"While Google, YouTube, Facebook and Twitter all say they are cooperating and acting in the best interest of citizens to remove this content, that is not the case, because they allow these videos to reappear all the time," said Lucinda Creighton, a senior adviser at the Counter Extremism Project, an international policy organization.


Facebook's artificial intelligence tools and human moderators apparently failed to detect the live stream of the shooting. The company says it was alerted by New Zealand police.

"Technology companies don't see this as a priority. They wring their hands and say it's terrible," Creighton said. "But what they don't do is prevent it from reappearing."

John Battersby, a counterterrorism expert at Massey University in New Zealand, said the country had been spared mass terrorist attacks partly because of its isolation. Social media has changed that.

"This man live-streamed the shootings and his supporters egged him on; most of them are not in New Zealand," he said. "Unfortunately, once the footage is online and has been downloaded, it may well stay (online)," he added.

The broadcast of the video could inspire copycats, said Steve Moore, a CNN law enforcement analyst and retired FBI supervisory special agent.

"What I would say to the public is this: Do you want to help the terrorists? Because if you do, sharing this video is exactly what you are doing," Moore said.

"Do not share the video, and do not be a part of this," he added.

Hadas Gold, Donie O'Sullivan, Samuel Burke and Paul Murphy contributed to this report.