They call it confirmation bias. We’ve all seen it, yet those who know it exists will forever be in the minority. In fact, confirmation bias may have swayed the election in Trump’s favor by eliciting emotion that closely aligned with the pissed off alt-right’s perception of both candidates.
Facebook, which still claims it isn’t a media company, is the fuel that feeds the flame.
It’s Facebook, after all, that unceremoniously cut ties with human editors in favor of algorithms. If you thought algorithms were the answer, you were quickly proven wrong. Story after story details their shortcomings as Facebook continues to display false narratives in its own ‘Trending’ section.
Once stories end up in Trending, all bets are off — these things get shared like venereal disease at a community college, all under the presumed authority of the Facebook banner.
For the tech savvy, Facebook is merely a tool to curate what people are talking about and add your own voice to the conversation. For baby boomers and luddites, however, it’s an authoritative source from which to read the news. Never mind the fact that Facebook isn’t actually producing any of it.
Worse, the entire site operates in a sort of filter bubble or echo chamber. You’re friends with people who share your interests and, largely, your views. When a Megyn Kelly story appeared in the Trending section, it was quickly shared tens (or hundreds) of thousands of times across the network. The (now-proven-false) story alleged Kelly was a closet Clinton supporter and that Fox News was none-too-happy about it.
It’s not the only time Facebook promoted a false story.
If you think that’s dangerous, how about these? The titles below are from websites that litter the web for the sole purpose of attracting cheap clicks through the spread of misinformation. Remember that confirmation bias thing? These are the stories that align with our deepest fears, preconceived notions, and communal beliefs.
They’re also completely false. In a world dominated by half-truths, these don’t even bother to go that far.
If you’ve been on the internet for more than a day — and to sites you weren’t led to by a social media link — you’ll see right through these. Most, however, won’t. While we take for granted our ability to see through bullshit, it’s a skill many aren’t blessed with. And nowhere is that more apparent than on Facebook.
There’s a side of correlation to go with this causation too.
Trump voters, according to polling data, tended to be older and less educated than Clinton supporters. I hope I’m not being too controversial in saying younger people tend to be more computer savvy than older generations. And controversial or not, it should go without saying that more education leads to a better filter for bullshit and misinformation.
So, in essence, we have a social network known for doing little to stop the spread of misinformation and a userbase known for eating up every word of it. It’s ironic, really, that this is typically the same segment of the population that blames the media for its distrust of the political process, yet digests news from alternative sites that are no more credible as an information source than Dr. Phil is as a doctor. Maybe they mean social media.
As Verge editor Casey Newton pointed out:
Everything I read for 18 months depicted Trump as a dangerous psychopath but tell me more about how this is the media’s fault
— Casey Newton (@CaseyNewton) November 9, 2016
He’s not wrong. The media vilified Trump to near-epic proportions, and rightfully so. This was the same candidate, after all, who mocked a disabled reporter, proposed a registration process for Muslims in the US (and a ban on future Muslim immigration), disrespected a POW on television, and was accused of sexual assault by at least 19 women.
No matter where you stand on policy, it’s hard to argue that Trump warranted the media cutting him some slack.
For what it’s worth, Facebook today promised to start ‘doing more’ to stop the spread of misinformation. According to Adam Mosseri, VP of product management at Facebook, in a comment to TechCrunch:
In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing.
Despite these efforts we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation. We’re committed to continuing to work on this issue and improve the experiences on our platform.
If that sounds like rather vague language from the propaganda machine, I couldn’t fault you for thinking that.
In fairness, Facebook has a difficult task at hand, and one it has committed to tackling on multiple occasions.
That’s really where the goodwill ends. Since committing to stop the spread of misinformation, Facebook has made few public changes that back up the promise. Well, unless you can make the argument that firing the only human eyes available to verify stories on Trending was a step in the right direction — I wouldn’t.