|Title / Titel:||Fake news in social media: Bad algorithms or biased users?|
|Author / Autor:||Zimmer, F., Scheibe, K., Stock, M., & Stock, W. G.|
|Source / Quelle:||Journal of Information Science Theory and Practice, 7(2), 40-53.|
|Language / Sprache:||English / Englisch|
Although fake news has been present throughout human history, deceptive information spread via social media now has a stronger effect on society than ever before. This article answers two research questions: (1) Is the dissemination of fake news supported by machines through the automatic construction of filter bubbles? and (2) Are echo chambers of fake news man-made, and if so, what are the information behavior patterns of the individuals reacting to fake news? We discuss the role of filter bubbles by analyzing social media's ranking and results-presentation algorithms. To understand the role of individuals in making and cultivating echo chambers, we empirically study the effects of fake news on the information behavior of the audience in a case study, applying quantitative and qualitative content analysis to online comments and replies (on a blog and on Reddit). We did find hints of filter bubbles; however, they are fed by the users' information behavior and only amplify users' behavioral patterns. Reading fake news and eventually drafting a comment or a reply may result from users' selective exposure to information, leading to a confirmation bias; i.e., users prefer news (including fake news) that fits their pre-existing opinions. However, the theory of selective exposure cannot explain all information behavior patterns following fake news; a variety of further individual cognitive structures come into play, such as non-argumentative or off-topic behavior, denial, moral outrage, meta-comments, insults, satire, and the creation of a new rumor.
Keywords: fake news, truth, information behavior, social media, filter bubble, echo chamber