Prof. Jason Eshraghian Delivering an Educational Class at 2024 Embedded Systems Week

What do Transformers have to learn from Biological Spiking Neural Networks?

The brain is the perfect place to look for inspiration when developing more efficient neural networks. One of its key differences from modern deep learning is that it encodes and processes information as spikes rather than as continuous, high-precision activations. This presentation will dive into how the open-source ecosystem has been used to develop brain-inspired neuromorphic accelerators, starting with our development of snnTorch, a Python training library for spiking neural networks with more than 100,000 downloads. We will then explore how this work links to our MatMul-free Language Model, providing insight into the next generation of large-scale, billion-parameter models.
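To make the spike-based encoding mentioned above concrete, here is a minimal sketch of a single leaky integrate-and-fire neuron simulated with snnTorch. It assumes the library's default `Leaky` layer settings (threshold of 1.0, subtractive reset) and an arbitrary constant input current chosen for illustration; it is not drawn from the talk itself.

```python
import torch
import snntorch as snn

# A single leaky integrate-and-fire (LIF) neuron layer.
# beta is the membrane potential decay rate per timestep (assumed value).
lif = snn.Leaky(beta=0.9)

# Initialize the membrane potential and drive the neuron with a
# constant input current over a few timesteps. The neuron emits a
# binary spike whenever its membrane potential crosses threshold,
# then resets, so information is carried by discrete spike events
# rather than continuous activations.
mem = lif.init_leaky()
cur = torch.tensor([0.6])  # hypothetical input current

for step in range(5):
    spk, mem = lif(cur, mem)
    print(f"step {step}: spike={spk.item():.0f}, mem={mem.item():.3f}")
```

In a full network, layers like this are typically interleaved with standard linear or convolutional layers and trained with surrogate-gradient backpropagation, which is the approach snnTorch is built around.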