In the aftermath of the U.S. presidential election, Mark Zuckerberg took the stage at Techonomy16 to address concerns that Facebook didn't do enough to stop the proliferation of fake news on News Feed.
Zuckerberg insisted that the company can always do more to improve the quality of the News Feed experience, but that Facebook could not have influenced the outcome of the election.
"Personally, I believe the idea that fake news on Facebook could have influenced the election, of which there is a very small amount, is a crazy idea," Zuckerberg said.
He argued, in essence, that the media failed to learn its lesson about dismissing Trump supporters as too gullible to decide for themselves. News Feed was just as likely to surface fake news about Clinton, yet the media remains steadfast in pointing at Trump supporters, implicitly assuming they relied on Facebook alone to make their decisions.
"People are smart and they understand what's important to them," noted Zuckerberg.
Instead, he asserted that the problem isn't the accessibility of facts, but rather content engagement. He noted that Trump's posts got more engagement than Clinton's on Facebook. Facebook research shows that nearly everyone on the platform is connected with at least someone who holds opposing ideological beliefs. The real question is how to change the way people react when they see a post they disagree with, and stop them from simply brushing it aside.
To get there, Facebook is working to involve humans more deeply in the creation of the ranking algorithms the company uses for content. News Feed now has a human quality panel that helps refine rankings: panelists are given stories and asked to rank them, giving Facebook a better idea of what makes a particular story fulfilling for the end user.
Zuckerberg had previously only addressed the election in a Facebook post featuring a photograph of his daughter Max. He noted at the time that "We are all blessed to have the ability to make the world better, and we have the responsibility to do it," but didn't elaborate on what that meant specifically for him and his company.
Adam Mosseri, VP of Product Management for News Feed, echoed much of what Zuckerberg said earlier today in a statement to TechCrunch, though his brief comments were notably less skeptical of the importance of removing propaganda.
"We understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation," Mosseri noted.
Despite all of the global concern about Trump's win, Zuckerberg did take a moment to make it clear that he doesn't believe any single person can fundamentally alter the arc of technological innovation.
Featured Image: Paul Sakuma Photography