"We don't check what people say before they say it, and frankly, I don't think society would want us to do that. Freedom means that you don't have to ask for permission first and that you can say what you want by default. "
That's Mark Zuckerberg, circa September 2017, when the Facebook CEO was describing his company's response to Russia's attempts to manipulate the last presidential election.
I assume he still stands by those comments today, after a shooter used Facebook to live-stream his massacre of dozens of people in New Zealand. Though I have asked Facebook PR to make sure.
But regardless of how Facebook – and Twitter and YouTube and Reddit and the other platforms that helped distribute images and videos from yesterday's mass shooting in Christchurch, which authorities have deemed a terrorist attack – responds to criticism of its role today, the most important thing to remember about these platforms is that they did exactly what they are designed to do: let people share what they want, whenever they want, with as many people as they want.
I don't want to be glib about this: Facebook obviously does not want murderers to live-stream their crimes to the world. But the company has built a tool that lets them do exactly that, on a platform that is fundamentally designed to let people say what they want without asking permission first.
As I wrote in 2017, this structure is the key to Facebook's tremendous success as a company – users supply the content, and Facebook's software distributes it globally, instantly, with minimal friction:
Facebook only works as a giant, billion-plus-user company because it lets users and advertisers upload whatever they want to its platform, without human intervention. And the fact that Facebook does not screen other people's comments, ads, or (almost) anything else before it goes up is also what gives it significant legal protection, particularly in the US: if something objectionable or illegal shows up on Facebook, it's not because Facebook put it there – someone put it on Facebook.
This setup is not unique to Facebook. All of the huge consumer platforms that have emerged from Silicon Valley in the last decade work the same way: YouTube and Twitter don't approve your comments or videos before you upload them, and Airbnb doesn't vet you before you rent out space in your house.
As Zuckerberg noted in 2017, Facebook tries to remove offensive content after it goes up, and the company says it deleted the shooter's account shortly after the live stream. The company also says it will spend billions on a combination of software and people to combat abuse in the future.
Last week, Zuckerberg announced plans to shift Facebook's focus from a public news feed to more personal, encrypted communication. But even at the end of Facebook's planned pivot, the platform would still have allowed the New Zealand shooter to do exactly what he did yesterday.
It is possible that Facebook's shift would reduce the virality of the shooting footage or other awful things, but it would not prevent such material from getting onto the platform. It is also possible that Facebook would have a harder time policing it, because the company plans to fully encrypt the messages people pass back and forth.
But per Zuckerberg – and again, because Facebook is built this way – Facebook will police abuse on its platform after it happens, in the same way that police respond to a crime once they are aware of it.
Here is a longer passage from the Zuckerberg statement I quoted at the beginning of this story:
Now, I'm not going to sit here and say that we will catch all the bad content in our system. We don't check what people say before they say it, and frankly, I don't think our society would want that. Freedom means that you don't have to ask permission first and that you can say what you want by default. If you break our community standards or the law, you will face the consequences afterwards.
It is hard to imagine what consequences Facebook could impose on someone who killed dozens of people on Friday. And it's hard to imagine that this won't happen again.