We submitted this extensive (and opinionated) guide to training spiking neural networks to the Proceedings of the IEEE 18 months ago. Since then, the preprint has reached 100+ citations, snnTorch has cracked 80,000 downloads, and the guide has helped numerous people enter the field of neuromorphic computing… and much of what was true 18 months ago has since changed significantly.
While we continue to wait for the peer review process to do its thing, I’ve taken the liberty of revamping the preprint to reflect the rapidly changing world of training and using SNNs.
The latest version includes “Practical Notes” with black magic tricks that have helped us improve the performance of SNNs, code snippets that cut down on verbose explanations, and a fresh account of some of the latest goings-on in the neuroscience-inspired deep learning world.
Thank you to Gregor Lenz, Xinxin Wang and Max Ward for working through this >50-page monster.
Preprint link here.