We live in a world where the news we read is more likely to come from social media than from a traditional news source. And whilst there are many positives to this, the unintended consequence of the "freedom" it gives us is that we become less aware of opinions that differ from our own.
Our preferences determine what we see, defining and filtering what ends up on our news feed. The more we like articles of a certain kind, the more likely we are to be shown similar articles as suggestions. The fewer clicks an opinion gets, the more likely that opinion disappears into a social abyss. Meanwhile, misinformation wrapped in clickbait has the potential to be socially legitimised.
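That feedback loop can be sketched in a few lines of code. The example below is a toy illustration with made-up names and data, not Facebook's actual ranking system: each click raises the user's affinity for a topic, and the feed is re-ranked by that affinity, so the topics a user already clicks on climb to the top while everything else sinks.

```python
# Toy sketch of an engagement-driven feed ranker (hypothetical, for illustration).
from collections import defaultdict

def rank_feed(articles, topic_affinity):
    """Order articles by the user's learned affinity for each article's topic."""
    return sorted(articles, key=lambda a: topic_affinity[a["topic"]], reverse=True)

def record_click(article, topic_affinity):
    """Each click strengthens the preference that ranked the article highly."""
    topic_affinity[article["topic"]] += 1

articles = [
    {"title": "Local sports roundup", "topic": "sports"},
    {"title": "Election analysis", "topic": "politics"},
    {"title": "New phone review", "topic": "tech"},
]

affinity = defaultdict(int)

# Simulate a user who only ever clicks politics stories:
for _ in range(3):
    feed = rank_feed(articles, affinity)
    clicked = next(a for a in feed if a["topic"] == "politics")
    record_click(clicked, affinity)

# After a few clicks, politics dominates the top of the feed.
print(rank_feed(articles, affinity)[0]["title"])  # Election analysis
```

Nothing in the loop ever promotes a topic the user hasn't clicked, which is the filter-bubble dynamic in miniature: the ranking optimises for past engagement, not for exposure to differing views.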
"Facebook's algorithm is central to how news & information is consumed in the world today, and no historian will write about 2016 without it." -@zeynep
News feed and search algorithms have progressed significantly over the years, but 2016 may have provided a reality check on using this technology responsibly: if the tech has the potential to shape a user's opinion, it must not value that user's existing opinion in isolation, but should allow it to be challenged by ideas from legitimate sources.
Beyond literal fake news spread via Facebook's click-hungry platform, the wider issue is the filter bubble that its preference-fed News Feed algorithms build around individuals as they work to spoonfeed them more clicks, keeping users spinning inside concentric circles of opinion, unexposed to alternative points of view.