Donald Trump’s surprising victory has left the world stunned. Americans are no strangers to embarrassing political nutbaggery. But when it comes to president-elect Donald Trump, we may have outdone ourselves.
Trump is seen as a woman-hating reality TV star whose campaign was mostly focused on his lust for ethnic cleansing — a native son of a country that worships selfishness over empathy, corporate interests over justice, and notoriety over prestige.
The global reaction to Trump’s bewildering election has largely been one of disgust.
French author Marie-Cecile Naves wrote: “Trump represents the America we love to hate…He is our negative mirror image, a man we see as brutal, who worships money and lacks culture — someone who lets us feel a bit superior about being European.”
And now that Donald Trump is officially going to be the next president of the United States, people around the world are asking, “What happened?”
Many of them are answering in a word: “Facebook.”
Facebook’s news feed algorithm is tuned to show you content you’re likely to engage with. It doesn’t distinguish between fact and fiction. Author Eli Pariser calls the resulting echo chamber the “filter bubble.”
In his 2011 best seller, “The Filter Bubble: What the Internet Is Hiding from You,” Pariser warns about the effects the filter bubble could have on democracy.
“Ultimately, democracy works only if we citizens are capable of thinking beyond our narrow self-interest. But to do so, we need a shared view of the world we cohabit. We need to come into contact with other people’s lives and needs and desires. The filter bubble pushes us in the opposite direction – it creates the impression that our narrow self-interest is all that exists. And while this is great for getting people to shop online, it’s not great for getting people to make better decisions together.”
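To make the dynamic Pariser describes concrete, here is a minimal, hypothetical sketch of engagement-only feed ranking. Facebook’s actual system is vastly more complex and not public; every name, field, and weight below is invented for illustration. The point is what the scoring function leaves out: there is no term for accuracy or source credibility.

```python
def rank_feed(posts, user_affinities):
    """Order posts purely by predicted engagement for one user.

    Note what is absent: no term for accuracy or credibility.
    A false story from a well-liked source outranks a true one
    from a source the user rarely engages with.
    """
    def score(post):
        # How much this user tends to engage with this source (0.0–1.0).
        affinity = user_affinities.get(post["source"], 0.0)
        return affinity * post["predicted_clicks"]

    return sorted(posts, key=score, reverse=True)

# Toy data: an engaging but inaccurate post vs. a sober, accurate one.
posts = [
    {"source": "partisan-blog", "predicted_clicks": 0.9, "accurate": False},
    {"source": "wire-service", "predicted_clicks": 0.4, "accurate": True},
]
affinities = {"partisan-blog": 0.8, "wire-service": 0.3}

feed = rank_feed(posts, affinities)
# The inaccurate but engaging post lands at the top of the feed.
```

Because the “accurate” field never enters the score, the ranking rewards whatever the user already likes — exactly the narrowing effect Pariser warns about.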
While there are certainly plenty of news sources that any curious reader could turn to, the truth is that Americans are increasingly leaving the curation to Facebook.
A recent Pew study found that a majority of U.S. adults – 63% – use Facebook as a source of news on events and issues beyond photos from friends and family. The problem is that Facebook users aren’t always good at distinguishing legitimate news sources from satire, propaganda, or plain false information. And when bad information goes viral, it can distort public opinion.
And yet Facebook has been unwilling to accept responsibility for this role, refusing to concede that it acts like a news organization, a media company. CEO Mark Zuckerberg said in August:
“We’re a technology company. We’re not a media company. When you think about a media company, you know, people are producing content, people are editing content, and that’s not us,” he said. “We exist to give you the tools to curate and have the experience that you want, to connect with the people and businesses and institutions in the world that you want.”
But waking up to the results of this election has made some people rethink Facebook’s role, and not in a good way.
A recent analysis of fake, inflammatory news on both right- and left-wing political sites on Facebook found that 58% of right-wing content was fake, compared with 19% of content on left-leaning sites.
The spreading of false information during the election cycle was so bad that President Barack Obama called Facebook a “dust cloud of nonsense.”
“People, if they just repeat attacks enough and outright lies over and over again, as long as it’s on Facebook and people can see it, as long as it’s on social media, people start believing it,” Obama said.
Now that Facebook is such an important part of the news cycle, its vetting process needs to mature. It should evaluate the person who is sharing a piece of content on Facebook, weigh the quality of the link being shared, and then determine how far a friend’s status message should really spread.
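The three-step vetting process suggested above — score the sharer, weigh the link’s source, then decide how far the post spreads — could be sketched roughly as follows. This is purely a hypothetical illustration, not anything Facebook has built; the function name, inputs, and thresholds are all invented.

```python
def allowed_reach(sharer_reputation, source_credibility, base_reach=10000):
    """Decide how many feeds a shared link may reach.

    sharer_reputation: 0.0-1.0, the sharing user's track record.
    source_credibility: 0.0-1.0, the quality of the linked site.
    base_reach: maximum audience for a fully trusted share.
    (All values and thresholds here are invented for illustration.)
    """
    trust = sharer_reputation * source_credibility
    if trust < 0.1:
        # Low-trust shares stay visible to direct friends only.
        return min(base_reach, 150)
    # Otherwise, reach scales with the combined trust score.
    return int(base_reach * trust)

# A reputable user sharing a credible outlet spreads widely...
print(allowed_reach(0.9, 0.8))   # 7200
# ...while a known spammer sharing a hoax site barely spreads at all.
print(allowed_reach(0.2, 0.3))   # 150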
If Facebook wants to be a platform where billions of people regularly find and share news, then it needs to accept some of the responsibility that comes with that power. That means coming up with some guidelines to help spread information responsibly.
Coming up with this sort of process isn’t censorship. It’s just being responsible.
But Mark Zuckerberg doesn’t seem to get that.
“Personally, I think the idea that fake news on Facebook influenced the election in any way is a pretty crazy idea,” Zuckerberg said on Thursday night.
That seems a bit tone deaf.
It’s not hard to see why Facebook is reluctant to do this. The internet was built on the legal foundation (Section 230 of the Communications Decency Act) that online companies are not liable for third-party content displayed on their sites.
But it’s possible to be informative and responsible. Just take a look at Google. The tech giant has built an algorithm that prioritizes the quality and relevance of an article. Anyone can write anything online, but not just any piece of content will show up in the first few pages of a Google search result.
Google takes its responsibility to surface the right information seriously — in part because its business depends on it — and by and large, people trust that the top results on Google will be legitimate.
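The link-analysis idea at the heart of Google’s original quality ranking is PageRank: a page matters if pages that matter link to it. Below is a minimal sketch of the core iteration, with an invented three-site toy graph; real search ranking uses hundreds of additional signals.

```python
def pagerank(links, damping=0.85, iters=50):
    """Compute a simple PageRank over a link graph.

    links: dict mapping each page to the list of pages it links to.
    A page's score grows when well-scored pages link to it.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page keeps a small baseline score...
        new = {p: (1 - damping) / n for p in pages}
        # ...and passes the rest of its score along its outbound links.
        for p, outs in links.items():
            if not outs:
                continue
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

# Toy graph: nobody links to the hoax site, so it earns little authority.
graph = {
    "hoax-site": ["reputable-outlet"],
    "reputable-outlet": ["wire-service"],
    "wire-service": ["reputable-outlet"],
}
ranks = pagerank(graph)
# Pages endorsed by well-linked pages score highest; the hoax site stays low.
```

The key design point is that authority is earned from other pages rather than self-declared, which is why merely publishing something does not get it onto the first page of results.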