Neurobiologists at the German Primate Center have developed a model that, for the first time, completely represents the neuronal processes from seeing an object to grasping it
Every day we effortlessly make countless grasping movements: we take a key in our hand, open the front door by operating the handle, then pull the door closed from the outside and lock it with the key. What comes naturally to us rests on a complex interaction of our eyes, several regions of the brain and, ultimately, the muscles of the arm and hand.

Neuroscientists at the German Primate Center (DPZ) - Leibniz Institute for Primate Research in Göttingen have now succeeded for the first time in developing a model that seamlessly represents the entire planning of a movement, from seeing an object to grasping it. Comprehensive neural and motor data from grasping experiments with two rhesus monkeys were decisive for developing the model: an artificial neural network that, when fed images of particular objects, simulates the processes and interactions in the brain that handle this information. The neuronal data from the artificial network model could explain the complex biological data from the animal experiments and thus confirm the validity of the functional model. In the long term, the model could be used to develop better neuroprostheses, for example to bridge the damaged nerve connection between brain and extremities in paraplegia and thus restore the transmission of movement commands from the brain to the arms and legs (PNAS).
Rhesus monkeys, like humans, have a highly developed nervous and visual system as well as dexterous hand motor control, which makes them particularly well suited for research into grasping movements. Previous studies in rhesus monkeys have shown that grasping a targeted object relies on the interaction of three brain areas. Until now, however, there has been no detailed model at the neuronal level that represents the entire process, from the processing of visual information to the control of the arm and hand muscles that grasp the object.
In order to develop such a model, two male rhesus monkeys were trained to grasp 42 objects of different shapes and sizes, presented to them in random order. The monkeys wore a data glove that continuously recorded the movements of their arm, hand and fingers. In each trial, the object to be grasped was briefly illuminated while the monkey fixated a red dot below it; after a short delay, a blinking signal cued the monkey to perform the grasping movement. This design reveals when the different brain areas become active in order to turn the visual signals into the grasping movement and the associated muscle activations.
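To make the trial structure concrete, here is a minimal sketch of one trial of such a delayed grasping task. The phase names and durations are hypothetical placeholders chosen for illustration; the actual task timings are reported in the paper.

```python
# Minimal sketch of one trial in a delayed grasping task.
# Phase durations are hypothetical placeholders, not the study's timings.
trial_phases = [
    ("fixation", 0.6),   # monkey fixates the red dot below the object
    ("cue", 0.7),        # object is briefly illuminated
    ("delay", 1.0),      # object dark again; the grasp must be withheld
    ("go", 0.1),         # fixation dot blinks: signal to move
    ("movement", 1.5),   # reach and grasp; data glove logs arm/hand/fingers
]

t = 0.0
for phase, duration in trial_phases:
    print(f"{t:5.2f} s  {phase}")
    t += duration
```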
In the next step, images of the 42 objects, taken from the monkeys' perspective, were fed into an artificial neural network whose functionality mimicked the biological processes in the brain. The network model consisted of three interconnected stages, corresponding to the three cortical brain areas of the monkeys, and provided meaningful insights into the dynamics of the brain networks. After training on the monkeys' behavioral data, the network precisely reproduced the grasping movements of the rhesus monkeys: it processed images of recognizable objects and generated the muscle dynamics required to grasp the objects accurately.
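As a rough illustration of such a staged architecture, the following Python/NumPy sketch chains three recurrent modules in series to map static image features onto a time course of muscle activations. All sizes, the leaky rate-unit dynamics and the purely serial coupling between stages are simplifying assumptions; the published model is trained in a goal-driven way and its connectivity differs.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_module(n_in, n_rec):
    """One recurrent stage with random input and recurrent weights."""
    return {
        "W_in": rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_rec, n_in)),
        "W_rec": rng.normal(0.0, 1.0 / np.sqrt(n_rec), (n_rec, n_rec)),
    }

def step(mod, x, h, dt=0.01, tau=0.1):
    """Leaky rate-unit update: h relaxes towards input plus recurrent drive."""
    drive = mod["W_in"] @ x + mod["W_rec"] @ np.tanh(h)
    return h + (dt / tau) * (-h + drive)

# Three serially connected stages, standing in for the three cortical areas.
n_img, n_rec, n_muscles = 256, 100, 50
modules = [init_module(n_img, n_rec),
           init_module(n_rec, n_rec),
           init_module(n_rec, n_rec)]
W_out = rng.normal(0.0, 1.0 / np.sqrt(n_rec), (n_muscles, n_rec))

def simulate(image_features, n_steps=100):
    """Hold an image as static input and unroll predicted muscle activity."""
    hs = [np.zeros(n_rec) for _ in modules]
    outputs = []
    for _ in range(n_steps):
        x = image_features
        for i, mod in enumerate(modules):
            hs[i] = step(mod, x, hs[i])
            x = np.tanh(hs[i])          # rates of stage i drive stage i + 1
        outputs.append(W_out @ np.tanh(hs[-1]))
    return np.array(outputs)

muscles = simulate(rng.normal(size=n_img))
print(muscles.shape)  # (100, 50): time steps x predicted muscle activations
```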
The results obtained with the artificial network model were then compared with the biological data from the monkey experiment. The neural dynamics of the model proved to be highly consistent with the neural dynamics of the monkeys' cortical brain areas. "This artificial model describes for the first time in a biologically realistic way the neuronal processing from seeing and recognizing an object, to action planning and hand muscle control during grasping", says Hansjörg Scherberger, head of the Neurobiology Laboratory at the DPZ, and he adds: "This model contributes to a better understanding of the neuronal processes in the brain and in the long term could be useful for the development of more efficient neuroprostheses."
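One common way to quantify such a correspondence between simulated and recorded population dynamics is canonical correlation analysis. The sketch below, using scikit-learn and random placeholder data, only illustrates that general approach; it is not the specific analysis performed in the paper.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Placeholder data: trial-averaged activity unrolled over time,
# shape (time points, units). In a real analysis, model_rates would come
# from the network simulation and neural_rates from the recordings.
rng = np.random.default_rng(1)
model_rates = rng.normal(size=(200, 120))
neural_rates = rng.normal(size=(200, 80))

# Project both populations onto shared canonical dimensions and check
# how well the paired time courses correlate.
n_dims = 8
cca = CCA(n_components=n_dims, max_iter=1000)
model_proj, neural_proj = cca.fit_transform(model_rates, neural_rates)

corrs = [np.corrcoef(model_proj[:, i], neural_proj[:, i])[0, 1]
         for i in range(n_dims)]
print(np.round(corrs, 2))  # values near 1 indicate closely matched dynamics
```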
Prof. Dr. Hansjörg Scherberger
Phone: +49 (0)551 3851-494
Email: HScherberger@dpz.eu
Michaels JA, Schaffelhofer S, Agudelo-Toro A, Scherberger H (2020): A goal-driven modular neural network predicts parietofrontal neural dynamics during grasping.
PNAS. DOI: 10.1073/pnas.2005087117, https://doi.org/10.1073/pnas.2005087117
http://medien.dpz.eu/pinaccess/showpin.do?pinCode=B9ZQABtiGmFm - Printable Pictures
https://www.dpz.eu/en/news/press-releases/single-view/news/ein-objekt-greifen-mo... - Press Release on DPZ-Website
Prof. Dr. Hansjörg Scherberger, Head of the Neurobiology Laboratory at the German Primate Center (DP ...
Karin Tilch
German Primate Center
A rhesus macaque (Macaca mulatta) wearing a data glove for detailed hand and arm tracking.
Ricarda Lbik
German Primate Center
Characteristics of this press release:
Journalists, students, scientists
Biology, medicine
transregional
Research results, scientific publications
English