Media caption: Christchurch was locked down as events unfolded
A gunman opened fire at a mosque in Christchurch, New Zealand, killing 49 people and injuring 20 others. He filmed the entire attack and broadcast it live directly to Facebook.
The result was an exhausting race for social media platforms to remove the footage, as seemingly endless copies were reproduced and widely shared in the aftermath of the attack.
And from social media, it found its way onto the front pages of some of the biggest news websites in the form of still images, gifs and even full video.
This series of events has once again shed light on how sites such as Twitter, Facebook, YouTube and Reddit are attempting, and failing, to combat far-right extremism on their platforms.
As the video continued to circulate, members of the public posted their own messages urging people to stop sharing it.
One of them stressed: "This is what the terrorist wanted".
What has been shared?
The video, which presents a first-person view of the massacre, has been widely disseminated.
How did people react?
While a large number of people duplicated and shared the images online, many others reacted with disgust, urging others not to share the images, or even to watch them.
Many said that spreading the video was exactly what the attacker had wanted people to do.
People were particularly angry at media outlets that published the images.
Channel 4 News presenter Krishnan Guru-Murthy named two British newspaper websites and accused them of reaching "a new clickbait low".
Buzzfeed reporter Mark Di Stefano also wrote that MailOnline had allowed readers to download the attacker's 74-page "manifesto" alongside its news report. The website subsequently removed the document and issued a statement saying it was "an error".
The Daily Mirror's editor, Lloyd Embley, also tweeted that the footage had been removed, and that its publication "did not match our policy on terrorist propaganda videos".
How did the social media companies react?
All of the social media companies expressed their condolences to the victims of the mass shooting, and said they were acting swiftly to remove inappropriate content.
Facebook said: "New Zealand Police alerted us to a video on Facebook shortly after the live stream began, and we removed both the shooter's Facebook account and the video.
"We are also removing any praise or support for the crime and the shooter(s) as soon as we become aware of it. We will continue to work directly with New Zealand Police as their response and investigation continues."
And in a tweet, YouTube said "our hearts are broken", adding that it was "working vigilantly" to remove any violent footage.
When it comes to their past efforts to counter the threat of far-right extremists, the companies' record has been more mixed.
Twitter acted to remove far-right accounts in December 2017. It had previously suspended, and then reinstated, the account of Richard Spencer, an American white nationalist who popularised the term "alternative right".
Facebook, which suspended Mr Spencer's account in April 2018, admitted at the time that it was difficult to distinguish between hate speech and legitimate political speech.
This month, YouTube was accused of being either incompetent or irresponsible in its handling of a video promoting the banned neo-Nazi group National Action.
UK MP Yvette Cooper said the video-streaming platform had repeatedly promised to block the video, only for it to reappear on the service.
What should happen next?
Dr Ciaran Gillespie, a political scientist at Surrey University, thinks the problem goes far beyond a single video, shocking as its content was.
"It's not just a question of broadcasting a live massacre. The social media platforms rushed to take it down, and there is not much they can do to stop it being shared, because of the nature of the platforms," he said.
At least 49 people were killed in two mosques in Christchurch
As a political researcher, he uses YouTube "a lot" and says he is often recommended far-right content.
"There are oceans of this content on YouTube and there is no way to estimate how much. YouTube has dealt with the threat posed by Islamic radicalisation, because that is considered clearly illegitimate, but the same pressure does not exist for far-right content, even though it poses a similar threat.
"YouTube will increasingly be called on to stop promoting racist and far right channels and content."
Dr Bharath Ganesh, a researcher at the Oxford Internet Institute, echoes his views.
"Taking the video down is obviously the right thing to do, but the social media sites have given far-right organisations a place to talk, and no consistent or integrated approach has been put in place to confront them.
"There has been a tendency to err on the side of freedom of expression, even when it is clear that some people are spreading toxic and violent ideologies."
Social media companies must now "take the threat posed by these ideologies much more seriously," he added.
"This could mean creating a special category for right-wing extremism, recognizing that it has global reach and networks."
Neither researcher underestimates the scale of the task, especially as many who promote far-right viewpoints are adept at what Dr Gillespie calls "legitimate controversy".
"People will discuss the threat posed by Islam and recognize that it is a contentious issue, while stressing that it is legitimate to discuss," he said.
Social media companies will have great difficulty tackling these grey areas, but after the tragedy that unfolded in New Zealand, many believe they must redouble their efforts.