"So what was the name of that classic Tarantino film that's so confusing?" we ask a friend. And the next time we log on to our streaming service, "Pulp Fiction" is at the top of the list. Sometimes it is downright creepy how much artificial intelligence knows about us. Scientists at the University of Duisburg-Essen (UDE) have been studying when and why we feel this way. Their conclusion: transparency, or the lack of it, is crucial.
Recommender systems are ubiquitous online. Usually unnoticed by us, they are meant to quickly pick the individually suitable product out of millions of possibilities. Modern algorithms and artificial intelligence analyze our hotel bookings, record the ratings we give to products and services online, track our music streaming and remember the recipes we look up most often.
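How such systems turn past behaviour into suggestions can be illustrated with a toy example. The sketch below is not the method used in the UDE study; it is a minimal, hypothetical item-based recommender in which made-up viewing data are compared with cosine similarity to score films a user has not yet seen.

# Illustrative sketch only: a toy item-based recommender. All data and names
# are hypothetical; this is not the system or method examined in the study.
import numpy as np

# Rows = users, columns = films; values = implicit feedback gathered from
# past behaviour (ratings, watch counts, clicks).
interactions = np.array([
    [5, 0, 3, 0],   # user A
    [4, 0, 0, 2],   # user B
    [0, 5, 4, 0],   # user C
])

def cosine_similarity(a, b):
    # How strongly two films are consumed by the same people.
    norm = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / norm if norm else 0.0

def recommend(user_index, top_n=2):
    # Score each unseen film by its similarity to films the user already liked.
    seen = interactions[user_index] > 0
    scores = {}
    for item in range(interactions.shape[1]):
        if seen[item]:
            continue  # only suggest films the user has not interacted with
        scores[item] = sum(
            cosine_similarity(interactions[:, item], interactions[:, liked])
            for liked in np.where(seen)[0]
        )
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend(user_index=0))  # films most similar to what user A watched

Real services combine many such signals across platforms, which is exactly why a fitting suggestion can feel inexplicable when the underlying data trail is invisible to the user.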
But nearly all of us are familiar with recommendations that fit almost too well and feel as if the computer had read our thoughts. Suggestions that somehow feel creepy. PhD student Helma Torkamaan from the working group "Interactive Systems" has experienced this herself, even though she knows the technology and uses it: "I talked to a colleague about a publication. The topic was new to me, but it sounded interesting. Shortly afterwards, this article appeared at the top of my recommendations."
Together with her colleague Catalin Barbu, she conducted a study in which 171 participants were shown recommendations for films, hotels and medical topics. For each one, the participants assessed: How well does the suggestion fit? Is the motivation behind it clear? What do we feel when we see a creepy recommendation? And what consequences does that have?
They found that people feel particularly spied on when they see no connection between an excellent recommendation and their previous online behaviour: in other words, it gets creepy when artificial intelligence hits the mark and we don't know why. If, on the other hand, media or platforms exchange information in a comprehensible way, this is accepted far more neutrally.
"We therefore recommend giving people control over what is personalized and how it is done", Barbu concludes. Otherwise, trust in the recommender system may decline or the user may even reject the proposed products.
Helma Torkamaan, Interactive Systems, +49 203/379-2276, helma.torkamaan@uni-due.de
Catalin-Mihai Barbu, Interactive Systems, +49 203/379-1730, catalin.barbu@uni-due.de
Helma Torkamaan, Catalin-Mihai Barbu, and Jürgen Ziegler. 2019. How Can They Know That? A Study of Factors Affecting the Creepiness of Recommendations. In Thirteenth ACM Conference on Recommender Systems (RecSys ’19), September 16–20, 2019, Copenhagen, Denmark. ACM, New York, NY, USA, 5 pages.
It feels creepy if the computer knows too much about us.
pixabay.com
Criteria of this press release:
Business and commerce, Journalists, Scientists and scholars, all interested persons
Information technology, Media and communication sciences, Philosophy / ethics, Psychology, Social studies
transregional, national
Miscellaneous scientific news/publications, Research results
English