idw – Informationsdienst Wissenschaft

News, Events, Experts

07/25/2024 11:00

Little Trust in Dr. ChatGPT

Gunnar Bartsch, Press and Public Relations
Julius-Maximilians-Universität Würzburg

    People trust medical advice less if they suspect that an artificial intelligence is involved in its creation. This is the key finding of a study by psychologists from the University of Würzburg.

    People used to ask Dr. Google if they wanted to know whether their symptoms indicate a mild stomach upset or terminal cancer; today, they are increasingly turning to ChatGPT. As a result, doctors are complaining about patients who come into their consulting rooms with ready-made diagnoses from the internet and are difficult to convince that they are not seriously ill.

    In fact, trust in the medical competence of artificial intelligence (AI) is nowhere near as pronounced as it seems. At least that is the result of a new study that has now been published in the journal Nature Medicine.

    A Pronounced Distrust toward AI

    The study shows that people rate medical advice as less reliable and less empathetic whenever they believe an AI is involved in producing it. This was the case even when participants could assume that a doctor had made the recommendations with the help of an AI. Consequently, respondents were also less willing to follow AI-supported recommendations than advice based solely on the expertise of human doctors.

    Moritz Reis and Professor Wilfried Kunde from the Chair of Psychology III at Julius-Maximilians-Universität (JMU) are responsible for this study, which was conducted in collaboration with Florian Reis from Pfizer Pharma GmbH.

    "The setting of our study is based on a digital health platform where information on medical issues can be obtained - in other words, a setting that will become increasingly relevant with increasing digitalization," the authors describe their approach.

    No Differences in Comprehensibility

    As part of the study, more than 2,000 participants received identical medical advice and were asked to evaluate it for reliability, comprehensibility and empathy. The only difference: while one group was told that this advice came from a doctor, the second group was told that an AI-supported chatbot was responsible. The third group was led to believe that a doctor had made the recommendation with the help of an AI.

    The results are clear: people trust medical recommendations less if they suspect that AI is involved. This also applies if they believe that medical staff were involved in generating the advice. Advice labelled as human-generated also scored better than the two AI variants in the "empathy" category. Only in terms of comprehensibility were there hardly any differences between the three groups; in this respect, people apparently have no reservations about the technology.

    Trust is Important for Successful Treatment

    "This is an important finding, as trust in medical diagnoses and therapy recommendations is known to be a very important factor for the success of the treatment," the authors of the study say. These findings are particularly important against the backdrop of a possible reduction in bureaucracy and relief for doctors' day-to-day work through cooperation with AI. In their opinion, the study therefore represents a starting point for detailed research into the conditions under which AI can be used in diagnostics and therapy without jeopardizing patients’ trust and cooperation.


    Contact for scientific information:

    Moritz Reis, Chair of Psychology III, T: +49 931 31-81629, moritz.reis@uni-wuerzburg.de


    Original publication:

    Moritz Reis, Florian Reis, Wilfried Kunde: Influence of believed AI involvement on the perception of digital medical advice. Nature Medicine (2024). DOI: 10.1038/s41591-024-03180-7



    Criteria of this press release:
    Journalists, Scientists and scholars
    Information technology, Medicine, Psychology
    transregional, national
    Research results
    English


     
