PERSONALIZATION ALGORITHMS – LIMITING THE SCOPE OF DISCOVERY? HOW ALGORITHMS FORCE OUT SERENDIPITY

Authors

  • Mihail Vuzharov, New Bulgarian University, Bulgaria

DOI:

https://doi.org/10.33919/dasc.18.1.2

Keywords:

Social media, algorithms, suggestion, discovery, encyclopedia

Abstract

The Digital has become ubiquitous and inevitable. Each day, fewer non-digitals remain, as they become digital immigrants and are finally succeeded by digital natives. Billions of devices are now connected, as remote access and the added value of the IoT have become commonplace. Cloud services have supplanted old-school digital products, personal data has become more valuable than most other resources, and our attention span has been shrinking, constantly besieged by millions of signals. It is now virtually impossible for anyone to exist outside of the Digital; it is virtually impossible not to rely on online services, not to have our data collected, not to have information tailored especially for our personal consumption, based on our unique digital footprints. UX Design paradigms have been shifting, moving us further from simple interaction, departing from on-screen interfaces, and simultaneously eliminating the need for a user’s encyclopedic competence (as per Eco) and even moving past navigational competence (as per Bankov). Communication structures define communication outcomes. Communication structures literally shape our world, as Benedict Anderson would argue. While his analysis turns to the printing press as a causal mechanism for the formation of nation states, one could argue that the algorithm-based structure of information delivery means a departure from the potential for serendipitous discovery, changing our systems of expectations, the way we think, and the way we perceive the world. If the entire system is based on our past, a mirror image of ourselves, we are more likely to receive answers pertaining to a world that is entirely within our scope. The farther we depart from encyclopedic competence, and then from navigational competence (where we were at least able to browse into areas unknown), the farther we move from the unfamiliar. There is an event horizon beyond which information lies completely outside our reach, and this event horizon is closing in on us ever more tightly. Essentially, our entire information inflow is based on a user model, derived by various algorithms, deep learning mechanisms and AI systems – a veritable black box, which, in turn, weaves a personalized and unique Dynamic Text for a very special Echian “model reader” – the “model user”. We will try to demonstrate how this relationship may lead to a limited outlook.
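The core of the argument is a feedback loop: a user model derived from past behaviour determines what is shown next, which in turn feeds back into the model. The article does not describe any implementation, so the following is a minimal, hypothetical Python sketch of such a similarity-based loop; the toy topic vectors, the cosine ranking, and all names and parameters are assumptions introduced purely for illustration, not the author's or any platform's actual method.

# A minimal, hypothetical sketch (not drawn from the article): a content-based
# recommender that ranks catalogue items purely by similarity to the user's
# past consumption, so the "model user" profile keeps feeding back into itself.

import math
import random

random.seed(1)

TOPICS = 8            # dimensionality of the toy "topic space" (assumed)
CATALOGUE_SIZE = 200  # number of available items (assumed)
ROUNDS = 20           # how many recommendation/consumption cycles to run


def random_item():
    """An item is a unit vector of topic weights."""
    v = [random.random() for _ in range(TOPICS)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]


def cosine(a, b):
    """Cosine similarity; for unit vectors this is just the dot product."""
    return sum(x * y for x, y in zip(a, b))


def profile(history):
    """The 'model user': a normalized mean of everything consumed so far."""
    mean = [sum(item[i] for item in history) / len(history) for i in range(TOPICS)]
    norm = math.sqrt(sum(x * x for x in mean)) or 1.0
    return [x / norm for x in mean]


def avg_pairwise_similarity(items):
    """How homogeneous a set of items is, on average."""
    pairs = [(a, b) for i, a in enumerate(items) for b in items[i + 1:]]
    return sum(cosine(a, b) for a, b in pairs) / len(pairs)


catalogue = [random_item() for _ in range(CATALOGUE_SIZE)]
history = [random.choice(catalogue)]  # a single seed interaction

for _ in range(ROUNDS):
    user_model = profile(history)
    unseen = [item for item in catalogue if item not in history]
    # Recommend only the unseen item most similar to the user's own past.
    best = max(unseen, key=lambda item: cosine(item, user_model))
    history.append(best)

# Compare the homogeneity of the personalized history with a random draw
# of the same size from the same catalogue.
print("personalized feed:", round(avg_pairwise_similarity(history), 3))
print("random selection: ", round(avg_pairwise_similarity(random.sample(catalogue, len(history))), 3))

In this toy setting, the personalized history typically ends up more internally similar than a random draw of the same size: each recommendation is selected from within the mirror image of past behaviour, so the items actually encountered cluster ever more tightly around the initial profile, a small-scale analogue of the tightening event horizon described above.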

References

Anderson, Benedict. 2016. Imagined Communities: Reflections on the Origin and Spread of Nationalism. New York: Verso.

Bankov, Kristian. 2010. Cultures of navigation versus cultures of erudition. Lexia Vol. I (September 2010), 103-123.

Bankov, Kristian. 2017. Eco and the Google Search Innovations, in Thellefsen, T. and Sørensen, B. (eds) Umberto Eco in His Own Words, Berlin: De Gruyter Mouton, pp. 119-126. https://doi.org/10.1515/9781501507144-015

Bergstein, Brian. 2017. We Need More Alternatives to Facebook. (last accessed January 2018 from https://www.technologyreview.com/s/604082/we-need-more-alternatives-to-facebook)

Bogost, Ian. 2017. For Google, Everything is a Popularity Contest. (last accessed January 2018 from https://www.theatlantic.com/technology/archive/2017/06/for-google-everything-is-a-popularity-contest/531762)

Eco, Umberto. 1978. A Theory of Semiotics. Bloomington: Indiana University Press.

Eco, Umberto. 1986. Semiotics and the Philosophy of Language. Bloomington: Indiana University Press.

Eco, Umberto. 2002. Serendipities: Language and Lunacy. London: Phoenix.

Eco, Umberto. 1991. The Limits of Interpretation. Bloomington: Indiana University Press.

Eco, Umberto. 1979. The Role of the Reader. Bloomington: Indiana University Press.

Knight, Will. 2017. The Dark Secret at the Heart of AI. (last accessed January 2018 from https://www.technologyreview.com/s/604087/the-dark-secret-at-the-heart-of-ai/)

O’Neil, Cathy. 2017. The Dystopian Future of Price Discrimination. (last accessed January 2018 from https://www.bloomberg.com/view/articles/2017-03-16/the-dystopian-future-of-price-discrimination)

O’Neil, Cathy. 2017. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Broadway Books.

Orszag, Peter. 2017. People Lie, But Search Data Tell the Truth. (last accessed January 2018 from https://www.bloomberg.com/view/articles/2017-05-09/people-lie-but-search-data-tell-the-truth)

Pariser, Eli. 2011. The Filter Bubble: What the Internet is Hiding from You. London: The Penguin Press. Ebook.

Pisanty, Valentina. 2015. From the model reader to the limits of interpretation. Semiotica Vol. 206, 37-61. https://doi.org/10.1515/sem-2015-0014

Sieckenius de Souza, Clarisse. 2004. The Semiotic Engineering of Human-Computer Interaction. Cambridge: The MIT Press.

Srnicek, Nick. 2017. Platform Capitalism. Cambridge: Polity Press.

Van Dijck, José. 2013. The Culture of Connectivity: A Critical History of Social Media. New York: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199970773.001.0001

Viner, Katharine. 2016. How technology disrupted the truth. (last accessed January 2018 from https://www.theguardian.com/media/2016/jul/12/how-technology-disrupted-the-truth)

Wall Street Journal, The. 2016. Blue Feed, Red Feed: See Liberal and Conservative Facebook, Side by Side. (last accessed January 2018 from http://graphics.wsj.com/blue-feed-red-feed/)


Published

2018-06-28

How to Cite

Vuzharov, M. (2018). PERSONALIZATION ALGORITHMS – LIMITING THE SCOPE OF DISCOVERY? HOW ALGORITHMS FORCE OUT SERENDIPITY. Digital Age in Semiotics & Communication, 1, 19–33. https://doi.org/10.33919/dasc.18.1.2