Fake News in Social Media: Bad Algorithms or Biased Users?

  • Zimmer, Franziska
  • Scheibe, Katrin
  • Stock, Mechtild
  • Stock, Wolfgang G.
  • Received : 2019.04.18
  • Accepted : 2019.06.05
  • Published : 2019.06.30

Abstract

Although fake news has been present throughout human history, today, with social media, deceptive information affects society more strongly than before. This article answers two research questions: (1) Is the dissemination of fake news supported by machines through the automatic construction of filter bubbles? and (2) Are echo chambers of fake news man-made, and if so, what are the information behavior patterns of the individuals reacting to fake news? We discuss the role of filter bubbles by analyzing social media's ranking and results presentation algorithms. To understand the roles of individuals in the process of making and cultivating echo chambers, we empirically study the effects of fake news on the information behavior of the audience in a case study, applying quantitative and qualitative content analysis to online comments and replies (on a blog and on Reddit). Indeed, we found hints of filter bubbles; however, they are fed by the users' information behavior and only amplify users' behavioral patterns. Reading fake news and eventually drafting a comment or a reply may be the result of users' selective exposure to information, leading to a confirmation bias; i.e., users prefer news (including fake news) that fits their pre-existing opinions. However, not all information behavior patterns following fake news can be explained by the theory of selective exposure; they also reflect a variety of further individual cognitive structures, such as non-argumentative or off-topic behavior, denial, moral outrage, meta-comments, insults, satire, and the creation of a new rumor.

Keywords

fake news; truth; information behavior; social media; filter bubble; echo chamber
