idw – Informationsdienst Wissenschaft

28.12.2020 13:11

Artificial intelligence improves accuracy: Self-learning algorithms analyze medical imaging data

Dr. Andreas Battenberg, Corporate Communications Center
Technische Universität München

    Imaging techniques enable a detailed look inside an organism. But interpreting the data is time-consuming and requires a great deal of experience. Artificial neural networks open up new possibilities: They require just seconds to interpret whole-body scans of mice and to segment and depict the organs in colors, instead of in various shades of gray. This facilitates the analysis considerably.

    How big is the liver? Does it change if medication is taken? Is the kidney inflamed? Is there a tumor in the brain, and have metastases already developed? To answer such questions, bioscientists and doctors have so far had to screen and interpret a wealth of data.

    "The analysis of three-dimensional imaging processes is very complicated," explains Oliver Schoppe. Together with an interdisciplinary research team, the TUM researcher has now developed self-learning algorithms to in future help analyze bioscientific image data.

    At the core of the AIMOS software – the abbreviation stands for AI-based Mouse Organ Segmentation – are artificial neural networks that, like the human brain, are capable of learning. "You used to have to tell computer programs exactly what you wanted them to do," says Schoppe. "Neural networks don't need such instructions: It's sufficient to train them by presenting a problem and a solution multiple times. Gradually, the algorithms start to recognize the relevant patterns and are able to find the right solutions themselves."
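    To make the idea concrete: the following is a minimal sketch – not the authors' code, and assuming PyTorch purely for illustration – of what "presenting a problem and a solution multiple times" looks like in practice. A small network is repeatedly shown toy input-output pairs, and gradient descent nudges its weights until its answers match the known solutions.

```python
# Minimal, hypothetical sketch of supervised training (not the AIMOS code).
import torch
import torch.nn as nn

# Toy stand-ins: 16 "problems" with 32 features each, plus the known "solution" (a class label).
inputs = torch.randn(16, 32)
targets = torch.randint(0, 2, (16,))

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(100):                  # present the same problem/solution pairs many times
    optimizer.zero_grad()
    predictions = model(inputs)           # the network's current attempt at a solution
    loss = loss_fn(predictions, targets)  # how far it is from the known solution
    loss.backward()                       # compute how the weights should change
    optimizer.step()                      # nudge the weights toward better answers
```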

    Training self-learning algorithms

    In the AIMOS project, the algorithms were trained with the help of images of mice. The objective was to assign the image points from the 3D whole-body scan to specific organs, such as the stomach, kidneys, liver, spleen, or brain. Based on this assignment, the program can then show the exact position and shape of each organ.
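    To illustrate what this voxel-to-organ assignment amounts to, here is a deliberately tiny, hypothetical stand-in – not the actual AIMOS architecture – assuming PyTorch: the model outputs one score per organ class for every image point, and the arg-max over those scores assigns each point an organ label.

```python
# Hypothetical toy segmentation model (the real AIMOS network is far more elaborate).
import torch
import torch.nn as nn

NUM_CLASSES = 6  # e.g. background, stomach, kidneys, liver, spleen, brain
model = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv3d(8, NUM_CLASSES, kernel_size=1),  # one score per class and voxel
)

scan = torch.randn(1, 1, 32, 32, 32)  # one grayscale 3D volume (toy size)
scores = model(scan)                  # shape: (1, NUM_CLASSES, 32, 32, 32)
labels = scores.argmax(dim=1)         # organ label for every image point
print(labels.shape, labels.unique())
```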

    "We were lucky enough to have access to several hundred image of mice from a different research project, all of which had already been interpreted by two biologists," recalls Schoppe. The team also had access to fluorescence microscopic 3D scans from the Institute for Tissue Engineering and Regenerative Medicine at the Helmholtz Zentrum München.

    Using a special technique, the researchers were able to completely remove the color from the bodies of deceased mice. The transparent bodies could then be imaged with a microscope, step by step and layer by layer. The distance between the measuring points was only six micrometers – which corresponds to the size of a cell. Biologists had also localized the organs in these datasets.
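    Once every image point carries an organ label, quantities such as organ size and position follow directly from the labels and the six-micrometer voxel spacing. The sketch below is a hypothetical illustration of that step, not part of the AIMOS pipeline; the label volume is random toy data.

```python
# Hypothetical sketch: reading organ size and position off a labeled 3D volume.
import numpy as np

labels = np.random.randint(0, 4, size=(64, 64, 64))  # toy labels: 0 = background, 1-3 = organs
organ_names = {1: "liver", 2: "kidney", 3: "spleen"}
voxel_size_mm = 0.006                                 # six micrometers between measuring points

for organ_id, name in organ_names.items():
    mask = labels == organ_id                         # all voxels assigned to this organ
    volume_mm3 = mask.sum() * voxel_size_mm ** 3      # organ size from the voxel count
    centroid = np.argwhere(mask).mean(axis=0)         # organ position as the mean voxel coordinate
    print(f"{name}: volume {volume_mm3:.4f} mm^3, centroid at {centroid.round(1)}")
```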

    Artificial intelligence improves accuracy

    At TranslaTUM, the computer scientists presented the data to their new algorithms. These learned faster than expected, Schoppe reports: "We only needed around ten whole-body scans before the software was able to successfully analyze the image data on its own – and within a matter of seconds. It takes a human hours to do this."

    The team then checked the reliability of the artificial intelligence with the help of 200 further whole-body scans of mice. "The result shows that self-learning algorithms are not only faster at analyzing biological image data than humans, but also more accurate," sums up Professor Bjoern Menze, head of the Image-Based Biomedical Modeling group at TranslaTUM at the Technical University of Munich.
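    How such a comparison against human annotations can be made is shown in the sketch below. The press release does not name the metric used, so the Dice overlap score here (1.0 means perfect agreement) is only an assumed, commonly used example.

```python
# Hypothetical sketch: comparing a predicted organ mask with a human reference annotation.
import numpy as np

def dice(pred: np.ndarray, reference: np.ndarray) -> float:
    """Dice overlap between two binary masks of the same shape (1.0 = perfect agreement)."""
    intersection = np.logical_and(pred, reference).sum()
    total = pred.sum() + reference.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

# Toy masks standing in for the network's output and a biologist's annotation.
pred = np.zeros((32, 32, 32), dtype=bool)
pred[8:20, 8:20, 8:20] = True
reference = np.zeros((32, 32, 32), dtype=bool)
reference[10:22, 10:22, 10:22] = True
print(f"Dice overlap: {dice(pred, reference):.2f}")
```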

    The intelligent software is to be used in the future primarily in basic research: "Images of mice are vital, for example, for investigating the effects of new medications before they are given to humans. Using self-learning algorithms to analyze image data will save a lot of time in the future," emphasizes Menze.

    ###

    The research was conducted at TranslaTUM, the Center for Translational Cancer Research, at the Technical University of Munich. The institute is part of the TUM University Hospital rechts der Isar and specializes in transferring cancer research insights into practical patient care through interdisciplinary cooperation. For the novel 3D microscopy, the scientists at TUM worked closely with experts at the Helmholtz Zentrum München.

    The research project was funded by the German Federal Ministry of Education and Research (BMBF) within the scope of the Software Campus Initiative, by the Deutsche Forschungsgemeinschaft (DFG) via the cluster of excellence Munich Cluster for Systems Neurology (SyNergy), and by a research grant from the TUM Institute for Advanced Study, which is funded by the German Excellence Initiative and the European Union. The research was also funded by the Fritz Thyssen Foundation. NVIDIA supported the work through its GPU Grant Program.


    Scientific contact:

    Prof. Dr. Bjoern H. Menze
    Technical University of Munich
    Department of Informatics
    Boltzmannstr. 3, 85748 Garching, Germany
    Tel.: +49 89 289 10930 – E-Mail: bjoern.menze@tum.de


    Original publication:

    Oliver Schoppe, Chenchen Pan, Javier Coronel, Hongcheng Mai, Zhouyi Rong, Mihail Ivilinov Todorov, Annemarie Müskes, Fernando Navarro, Hongwei Li, Ali Ertürk, Bjoern H. Menze
    Deep learning-enabled multi-organ segmentation in whole-body mouse scans
    Nature Communications, 6 November 2020 – DOI: 10.1038/s41467-020-19449-7


    Further information:

    https://www.nature.com/articles/s41467-020-19449-7#Abs1 Original publication
    https://www.tum.de/nc/en/about-tum/news/press-releases/details/36399/ Press release on TUM-website
    http://campar.in.tum.de/Chair/ResearchIBBM Website of the research group


    Images

    First author Oliver Schoppe at the Center for Translational Cancer Research (TranslaTUM) of the Technical University of Munich
    Astrid Eckert / TUM

    Thanks to artificial intelligence, the AIMOS software is able to recognize bones and organs in three-dimensional grayscale images and to segment them, which makes the subsequent evaluation considerably easier.
    Astrid Eckert / TUM


    Characteristics of this press release:
    Journalists, teachers/students, university students, scientists, general public
    Biology, information technology, medicine, physics / astronomy
    transregional
    Research results, scientific publications
    English


     

