idw - Informationsdienst Wissenschaft
Erlangen, Germany: The Fraunhofer Institute for Integrated Circuits IIS has developed an AI chip for processing spiking neural networks (SNNs). SENNA, an inference accelerator for spiking neural networks, is inspired by the functioning of the brain: it consists of artificial neurons and processes electrical impulses (spikes) directly. Its speed, energy efficiency, and compact design make it possible to use SNNs directly where the data is generated: in devices at the edge.
SNNs consist of a network of artificial neurons connected to each other by synapses. Information is transmitted and processed in the form of electrical impulses. Spiking networks enable the next stage in the development of artificial intelligence: even faster, even more energy-efficient, even closer to the way the human brain processes data. Applying these advantages in practice calls for small, efficient hardware that replicates this structure of neurons and synapses. To this end, Fraunhofer IIS developed the neuromorphic SNN accelerator SENNA as part of the Fraunhofer SEC-Learn project.
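To make this structure concrete, the following minimal Python sketch models a leaky integrate-and-fire (LIF) layer, a common way to represent spiking neurons and synapses in software. It is an illustration only; the layer size, leak, and threshold values are assumptions and do not describe SENNA's internal design.

```python
import numpy as np

def lif_step(v, in_spikes, weights, leak=0.9, threshold=1.0):
    """One time step of a leaky integrate-and-fire (LIF) neuron layer.

    v          : membrane potentials of the neurons (1-D array)
    in_spikes  : binary spike vector coming from the previous layer
    weights    : synaptic weight matrix (inputs x neurons)
    """
    # Synaptic input: only inputs that actually spiked contribute.
    v = leak * v + in_spikes @ weights
    # Neurons whose potential crosses the threshold emit a spike ...
    out_spikes = (v >= threshold).astype(np.float32)
    # ... and their potential is reset afterwards.
    v = np.where(out_spikes > 0, 0.0, v)
    return v, out_spikes

# Tiny example: 4 inputs feeding 3 neurons over a few time steps.
rng = np.random.default_rng(0)
weights = rng.uniform(0.0, 0.6, size=(4, 3))
v = np.zeros(3)
for t in range(5):
    in_spikes = rng.integers(0, 2, size=4).astype(np.float32)
    v, out_spikes = lif_step(v, in_spikes, weights)
    print(f"t={t}  input spikes={in_spikes}  output spikes={out_spikes}")
```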
Small and fast AI processor
SENNA is a neuromorphic chip for the fast processing of low-dimensional time series data in AI applications. Its current version integrates 1,024 artificial neurons on a chip area of less than 11 mm². With response times as short as 20 nanoseconds, the chip ensures precise timing, especially in time-critical applications at the edge. Accordingly, its strengths really come to the fore in the real-time evaluation of event-based sensor data and in closed-loop control systems, for example in the AI-based control of small electric motors. SENNA can also be used to implement AI-optimized data transmission in communication systems: there, the AI processor can analyze signal streams and adapt transmission and reception procedures as required to improve the efficiency and performance of the transmission.
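As a rough illustration of the closed-loop scenario described above, the sketch below runs a purely event-driven control loop: the network is evaluated only when a sensor event arrives, and each event may trigger a motor command. The event stream and the trivial decision function are invented for this example and do not represent SENNA's interfaces or latency figures.

```python
from collections import deque

def snn_decision(event_value):
    """Stand-in for one SNN inference step on the accelerator."""
    return "brake" if event_value > 0.8 else None

# Simulated stream of events from an event-based sensor.
sensor_events = deque([0.2, 0.9, 0.1, 0.85])

# Event-driven loop: work is done per event, not per fixed time frame,
# which is what keeps response times short on dedicated hardware.
while sensor_events:
    value = sensor_events.popleft()     # next event, as it occurs
    command = snn_decision(value)       # evaluated immediately per event
    if command:
        print(f"event={value:.2f} -> motor command: {command}")
```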
Pure energy efficiency through spiking neurons
One of the reasons why SNNs work so energy-efficiently is that the neurons are activated only sparingly and in response to specific events. With its spiking neurons, SENNA makes full use of this energy-saving advantage. Thanks to its fully parallel processing architecture, the artificial neurons precisely map the temporal behavior of SNNs. SENNA can also work directly with spike-based input and output signals via its integrated spike interfaces. In this way, it fits seamlessly into an event-based data stream. “With its novel architecture, SENNA resolves the trade-off between energy efficiency, processing speed, and versatility like no other edge AI processor. This makes it perfect for resource-limited applications that require extremely fast response times in the nanosecond range,” explains Michael Rothe, Group Manager Embedded AI at Fraunhofer IIS.
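The energy argument can be made tangible with a small sketch: when only a handful of inputs spike in a given time step, an event-driven update touches only the synapses of those inputs instead of the full weight matrix. The network size and the 16-spike activity level below are arbitrary assumptions, not SENNA figures.

```python
import numpy as np

rng = np.random.default_rng(1)
n_inputs, n_neurons = 1024, 1024
weights = rng.normal(size=(n_inputs, n_neurons)).astype(np.float32)

# Sparse activity: only ~1.5 % of the inputs fire in this time step.
spikes = np.zeros(n_inputs, dtype=np.float32)
active = rng.choice(n_inputs, size=16, replace=False)
spikes[active] = 1.0

# Dense update: touches every synapse regardless of activity.
dense_input = spikes @ weights

# Event-driven update: touches only the rows of inputs that spiked.
sparse_input = weights[active].sum(axis=0)

print("results match:", np.allclose(dense_input, sparse_input, atol=1e-4))
print(f"synapses touched: dense={n_inputs * n_neurons:,}, "
      f"event-driven={len(active) * n_neurons:,}")
```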
Scalable and easy to program
The current SENNA reference design is conceived for 22 nm manufacturing processes. This means that the SNN processor can be deployed as a chip in a wide variety of applications and implemented cost-effectively. Its design is scalable and can be adapted to specific applications, performance requirements, and special features of the target hardware prior to chip production. Even after the chip has been manufactured, SENNA retains maximum flexibility because it is fully programmable: the SNN model in use can be changed at any time and transferred to SENNA again. To make it as easy as possible for developers to implement their AI models, Fraunhofer IIS also provides a comprehensive software development kit for SENNA.
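The reprogrammability described above can be pictured as keeping the accelerator fixed while the network itself is just an exchangeable set of parameters. The sketch below uses an invented, generic file format purely for illustration; it is not the SENNA software development kit or its API.

```python
import json
import numpy as np

def export_model(weights, thresholds, leak, path):
    """Write an SNN parameter set (weights, thresholds, leak) to a file."""
    model = {
        "weights": weights.tolist(),
        "thresholds": thresholds.tolist(),
        "leak": leak,
    }
    with open(path, "w") as f:
        json.dump(model, f)

def load_model(path):
    """Read the parameter set back, e.g. before transferring it to hardware."""
    with open(path) as f:
        model = json.load(f)
    return (np.array(model["weights"], dtype=np.float32),
            np.array(model["thresholds"], dtype=np.float32),
            model["leak"])

# Update a model offline, export it, and later reload/retransfer it.
rng = np.random.default_rng(2)
w = rng.normal(size=(8, 4)).astype(np.float32)
export_model(w, np.full(4, 1.0, dtype=np.float32), 0.9, "snn_model_v2.json")
weights, thresholds, leak = load_model("snn_model_v2.json")
print("reloaded model:", weights.shape, thresholds.shape, leak)
```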
https://www.iis.fraunhofer.de/en/pr/2025/pressrelease_ki_chip_senna.html
https://www.iis.fraunhofer.de/en/ff/kom/ai/snn/senna.html
Image caption: SENNA, a neuromorphic SNN chip for ultra-fast and energy-efficient processing of low-dimensional time series data.
Image credit: Fraunhofer IIS / Paul Pulkert
Characteristics of this press release:
Journalists
Electrical engineering, information technology
transregional
Research / knowledge transfer
English