idw – Informationsdienst Wissenschaft

News, Events, Experts

12/11/2025 09:00

Fairness in AI: Study Shows Central Role of Human Decision-Making

Falko Schoklitsch, Communications and Marketing
Technische Universität Graz

    In addition to helping in a practical way, recommendations based on AI should above all be fair. A new study by researchers at TU Graz and the Know Center shows how this can be achieved.

    AI-supported recommender systems should provide users with the best possible suggestions for their enquiries. These systems often have to serve different target groups and take into account other stakeholders who also influence the machine’s response, such as service providers, municipalities or tourism associations. So how can fair and transparent recommendations be achieved? Researchers from Graz University of Technology (TU Graz), the University of Graz and the Know Center investigated this question using a cycling tour app from the Graz-based start-up Cyclebee, examining how the diversity of human needs can be taken into account by AI. The study, which was awarded a Mind the Gap research prize for gender and diversity by TU Graz, was funded by the Styrian Future Fund.

    Impact on numerous groups

    “AI-supported recommender systems can have a major influence on purchasing decisions or the development of guest and visitor numbers,” says Bernhard Wieser from the Institute of Human-Centred Computing at TU Graz. “They provide information on services or places worth visiting and should ideally take individual needs into account. However, there is a risk that certain groups or aspects are under-represented.” In this context, an important finding of the research was that the fairness being sought is a multi-stakeholder problem: not only end users play a role, but also numerous other actors.

    These include service providers such as hotels and restaurants along the routes and third parties such as municipalities and tourism organisations. And then there are stakeholders who don’t even come into contact with the app but are nevertheless affected, such as local residents who could feel the effects of overtourism. According to the study, reconciling the interests of all these stakeholders cannot be achieved with technology alone. “If the app is to deliver the fairest possible results for everyone, the fairness goals must be clearly defined in advance. And that is a very human process that starts with deciding which target group to serve,” says Bernhard Wieser.

    Involving all actors in the design

    This target group decision influences the selection of the AI training data, its weighting and further steps in the algorithm design. In order to involve the other stakeholders as well, the researchers propose participatory design, in which all actors are involved so that their ideas can be harmonised as far as possible. “Ultimately, however, a decision has to be made in favour of something, and that comes down to the individual case,” says Dominik Kowald from the Fair AI group at the Know Center research centre and the Department of Digital Humanities at the University of Graz. “Not everything can be optimised at the same time with an AI model. There is always a trade-off.”
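    The trade-off the researchers describe can be illustrated with a minimal, hypothetical scoring sketch. All route names, scores and weights below are invented for illustration; this is not the Cyclebee algorithm, only a generic example of how weighting different stakeholder groups changes a ranking:

```python
# Hypothetical multi-stakeholder ranking sketch (invented data).
# Each candidate route carries one score per stakeholder group;
# the final order depends entirely on how those groups are weighted.

def rank_routes(routes, weights):
    """Rank routes by a weighted sum of per-stakeholder scores."""
    def total(route):
        return sum(weights[group] * route["scores"][group] for group in weights)
    return sorted(routes, key=total, reverse=True)

routes = [
    {"name": "river loop",
     "scores": {"user": 0.9, "provider": 0.3, "residents": 0.4}},
    {"name": "vineyard tour",
     "scores": {"user": 0.6, "provider": 0.8, "residents": 0.9}},
]

# Optimising only for end users puts the river loop first ...
user_first = rank_routes(routes, {"user": 1.0, "provider": 0.0, "residents": 0.0})

# ... while weighting all three groups equally puts the vineyard tour first.
balanced = rank_routes(routes, {"user": 1.0, "provider": 1.0, "residents": 1.0})
```

    The point of the sketch is that no single weighting is "correct": choosing the weights is exactly the human, case-by-case decision the researchers describe.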

    Ultimately, it is up to the developers to decide what this trade-off looks like, but according to the researchers, transparency is important for both end users and providers. Users want to be able to adapt or influence the recommendations, and providers want to know the rules by which routes are set or providers are ranked. “Our study results are intended to support software developers in their work in the form of design guidelines, and we also want to provide guidelines for political decision-makers,” says Bernhard Wieser. “It is important that technological developments make recommender systems increasingly available to smaller, regional players. This would make it possible to develop fair solutions and thus create counter-models to multinational corporations, which would sustainably strengthen regional value creation.”


    Contact for scientific information:

    Bernhard WIESER
    Assoc.Prof. Mag.phil. Dr.phil.
    TU Graz | Institute of Human-Centred Computing
    Phone: +43 316 873 30661
    bernhard.wieser@tugraz.at

    Dominik KOWALD
    Univ.-Prof. Dipl.-Ing. Dr.techn. BSc
    Know Center Research GmbH and University of Graz | Department of Digital Humanities
    Phone: +43 664 619 1718
    dkowald@know-center.at


    Original publication:

    Multistakeholder Fairness in Tourism: What Can Algorithms Learn from Tourism Management? https://doi.org/10.3389/fdata.2025.1632766


    Images

    The route suggested by AI depends heavily on human input during its creation.

    Copyright: primipil - Adobe Stock


    Criteria of this press release:
    Business and commerce, Journalists, Scientists and scholars
    Information technology, Media and communication sciences
    transregional, national
    Research projects, Research results
    English

