Scientists led by physicists Prof. Wolfram Pernice, Prof. Martin Salinga, and computer specialist Prof. Benjamin Risse, all from the University of Münster (Germany), have developed an event-based architecture using photonic processors. This architecture, similar to the brain, enables the continuous adaptation of connections within the neural network.
Modern computer models, such as those used for complex AI applications, push traditional digital computing processes to their limits. New computing architectures that emulate the working principles of biological neural networks promise faster and more energy-efficient data processing. The researchers have developed an event-based architecture that uses photonic processors to transport and process data by means of light. Similar to the brain, this allows for the continuous adaptation of connections within the neural network, which forms the basis for learning processes. The study was conducted by a team from Collaborative Research Centre 1459 (“Intelligent Matter”), led by physicists Prof. Wolfram Pernice and Prof. Martin Salinga and computer specialist Prof. Benjamin Risse, in collaboration with researchers from the Universities of Exeter and Oxford in the UK. The study has been published in the journal “Science Advances.”
Artificial neurons that are activated by external excitatory signals and connected to other neurons are the foundation of machine-learning neural networks. The connections between these artificial neurons are called synapses, just like their biological counterparts. For their study, the Münster research team used a network consisting of approximately 8,400 optical neurons made of waveguide-coupled phase-change material. The team demonstrated that the connections between these neurons can become stronger or weaker (synaptic plasticity), that new connections can be formed, and that existing connections can be eliminated (structural plasticity). In contrast to earlier studies, the synapses were not hardware elements but were encoded in the properties of the optical pulses, such as their wavelength and intensity. This made it possible to integrate several thousand neurons, together with their connections, on a single chip.
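To make the pulse-coding idea concrete, the following is a minimal software sketch of how synapses could be represented as bookkeeping entries carrying a wavelength and an intensity rather than as physical hardware elements. All class and function names, units, and update rules here are hypothetical illustrations invented for this article, not the authors' implementation; in the real system these operations correspond to optical pulses acting on photonic hardware.

```python
# Toy model of pulse-coded synapses (illustrative only): each connection is
# addressed by a wavelength channel, and its "weight" is the pulse intensity.
from dataclasses import dataclass


@dataclass
class Synapse:
    wavelength_nm: float  # wavelength channel that addresses this connection
    intensity: float      # pulse intensity, acting as the synaptic weight


class PulseCodedNetwork:
    def __init__(self):
        # Synapses are software entries keyed by (pre, post) neuron indices,
        # not dedicated hardware elements.
        self.synapses: dict[tuple[int, int], Synapse] = {}

    def connect(self, pre: int, post: int, wavelength_nm: float, intensity: float = 0.5):
        # Structural plasticity: forming a new connection.
        self.synapses[(pre, post)] = Synapse(wavelength_nm, intensity)

    def prune(self, pre: int, post: int):
        # Structural plasticity: eliminating an existing connection.
        self.synapses.pop((pre, post), None)

    def adjust(self, pre: int, post: int, delta: float):
        # Synaptic plasticity: strengthening or weakening a connection by
        # changing its pulse intensity (clamped to a normalized range).
        syn = self.synapses[(pre, post)]
        syn.intensity = min(1.0, max(0.0, syn.intensity + delta))


net = PulseCodedNetwork()
net.connect(0, 1, wavelength_nm=1550.0)  # form a connection
net.adjust(0, 1, +0.2)                   # strengthen it
net.prune(0, 1)                          # eliminate it again
```

Because a connection is identified by pulse properties rather than by a dedicated circuit element, adding or removing a synapse in this picture is just a change of bookkeeping, which is what allows thousands of neurons and their connections to share one chip.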
Compared with traditional electronic processors, light-based processors offer significantly higher bandwidth, which makes it possible to carry out complex computing tasks with lower energy consumption. This new approach is part of basic research. “Our goal is to develop an optical computing architecture that can rapidly and energy-efficiently handle AI applications in the long term,” says Frank Brückerhoff-Plückelmann, one of the lead authors.
Methodology: The non-volatile phase-change material can switch between an amorphous structure and a crystalline structure with a highly ordered atomic lattice. This feature allows permanent data storage even without an energy supply. The researchers tested the neural network’s performance by using an evolutionary algorithm to train it to distinguish between German and English texts. The recognition parameter they used was the number of vowels in the text.
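As an illustration of the training setup, here is a toy sketch of an evolutionary algorithm that tunes a single threshold on the vowel fraction of a text in order to separate German from English. The sample sentences, the fitness function, and the simple mutate-and-select loop are all invented for demonstration; they stand in for the study's far larger network and its actual evolutionary algorithm.

```python
# Toy evolutionary training of a vowel-based language classifier
# (illustrative sketch only; data and algorithm are invented here).
import random

GERMAN = ["die Wissenschaftler untersuchen neue Materialien",
          "das Licht wird im Wellenleiter geführt",
          "künstliche neuronale Netze lernen schnell"]
ENGLISH = ["the scientists investigate new materials",
           "light is guided in the waveguide",
           "artificial neural networks learn quickly"]


def vowel_fraction(text: str) -> float:
    # Recognition parameter: the share of vowels among all letters.
    letters = [c for c in text.lower() if c.isalpha()]
    vowels = sum(c in "aeiouäöü" for c in letters)
    return vowels / len(letters)


def fitness(threshold: float) -> int:
    # Count correct classifications: predict "German" when the vowel
    # fraction exceeds the candidate threshold (the direction is an
    # arbitrary choice for this toy example).
    correct = sum(vowel_fraction(t) > threshold for t in GERMAN)
    correct += sum(vowel_fraction(t) <= threshold for t in ENGLISH)
    return correct


# Simple (1+1) evolution strategy: mutate the threshold, keep the better one.
rng = random.Random(0)
best = 0.5
for _ in range(200):
    candidate = best + rng.gauss(0, 0.02)
    if fitness(candidate) >= fitness(best):
        best = candidate

print(f"evolved threshold: {best:.3f}, training accuracy: {fitness(best)}/6")
```

The evolutionary approach is a natural fit here because it only needs the fitness score, not gradients, so the same mutate-and-select loop can in principle drive parameters that live in hardware, such as the state of a phase-change material.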