snntorch vs norse

| | snntorch | norse |
|---|---|---|
| Mentions | 2 | 6 |
| Stars | 1,085 | 613 |
| Stars growth | - | 2.0% |
| Activity | 9.2 | 6.4 |
| Latest commit | 10 days ago | 8 days ago |
| Language | Python | Python |
| License | MIT License | GNU Lesser General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
snntorch
- Bio inspired computer vision
Spiking Neural Networks (SNNs): neural networks that use spiking neurons (i.e. neurons that communicate using asynchronous binary spikes, similar to biological neurons) instead of artificial neurons. Apart from this particularity, SNNs can be organized in any of the familiar topologies, like CNNs, ViTs, etc. There are tons of approaches to training SNNs, like bio-inspired learning rules (STDP, three-factor rules, etc.) or adaptations of backprop (which remains the SOTA in a lot of vision tasks). A good resource to begin with backprop-trained SNNs: https://snntorch.readthedocs.io/en/latest/ .
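The spiking neuron mentioned above can be illustrated with a minimal leaky integrate-and-fire (LIF) model. This is a plain-Python sketch, not the snntorch API; the `beta` leak factor, threshold of 1.0, and subtract-on-fire reset are common defaults chosen here for illustration:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# leaks toward zero, integrates input current, and emits a binary spike
# when it crosses the threshold (parameters are illustrative defaults).

def lif_step(input_current, membrane, beta=0.9, threshold=1.0):
    """Advance one discrete time step; return (spike, new_membrane)."""
    membrane = beta * membrane + input_current   # leaky integration
    spike = 1 if membrane >= threshold else 0    # asynchronous binary spike
    membrane -= spike * threshold                # soft reset after firing
    return spike, membrane

# Drive the neuron with a constant current and record its spike train.
mem, spikes = 0.0, []
for _ in range(10):
    spk, mem = lif_step(0.4, mem)
    spikes.append(spk)
```

With a constant input the neuron fires at a regular rate; information is carried by the timing and count of these binary events rather than by continuous activations.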
- How to train brain-inspired spiking neural networks using lessons from deep learning. Interactive Colab notebook links in thread.
Github: https://github.com/jeshraghian/snntorch
norse
- Neuromorphic learning, working memory, and metaplasticity in nanowire networks
This gives you a ludicrous advantage over current neural net accelerators. Specifically, 3-5 orders of magnitude in energy and time, as demonstrated in the BrainScaleS system https://www.humanbrainproject.eu/en/science-development/focu...
Unfortunately, that doesn't solve the problem of learning. Just because you can build efficient neuromorphic systems doesn't mean that we know how to train them. Briefly put, the problem is that a physical system has physical constraints. You can't just read the global state of a nanowire network and apply gradient descent as we would in deep learning. Rather, we have to somehow use local signals to produce local updates that are helpful on a global scale. That's why they use Hebbian learning in the paper (what fires together, wires together), but it's tricky to get right and I haven't personally seen examples that scale to systems/problems of "interesting" sizes. This is basically the frontier of the field: we need local, but generalizable, learning rules that are stable across time and compose freely into higher-order systems.
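The Hebbian idea described above can be sketched in a few lines. All names here are illustrative; a real nanowire or neuromorphic substrate imposes its own physical constraints on what "weight" and "update" mean. The key property is locality: each synapse is updated using only its own pre- and post-synaptic spikes, with no global error signal:

```python
# Sketch of a local Hebbian update ("what fires together, wires together").
# Each weight w[i][j] is strengthened when pre-synaptic spike i coincides
# with post-synaptic spike j; a small decay term keeps weights bounded.

def hebbian_update(weights, pre, post, lr=0.1, decay=0.01):
    """Update every synapse using only locally available signals:
    the pre- and post-synaptic spikes at its two endpoints."""
    return [
        [w + lr * pre_i * post_j - decay * w
         for w, post_j in zip(row, post)]
        for row, pre_i in zip(weights, pre)
    ]

w = [[0.5, 0.5], [0.5, 0.5]]
pre, post = [1, 0], [1, 1]      # only pre-synaptic neuron 0 fired
w = hebbian_update(w, pre, post)
# Synapses from neuron 0 are strengthened; those from neuron 1 only decay.
```

The comment's point is that rules of exactly this shape are easy to state but hard to make stable and composable at scale, since nothing in the update "knows" whether the strengthened synapse helps the system's global objective.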
Regarding educational material, I'm afraid I haven't seen great entries for learning about SNNs in full generality. I co-author a simulator (https://github.com/norse/norse/) based on PyTorch with a few notebook tutorials (https://github.com/norse/notebooks) that may be helpful.
I'm actually working on some open resources/course material for neuromorphic computing. So if you have any wishes/ideas, please do reach out. Like, what would a newcomer be looking for specifically?
- [D] The Complete Guide to Spiking Neural Networks
Surrogate gradients and BPTT; this is what is implemented in Norse https://github.com/norse/norse. It is also possible to compute exact gradients using the EventProp algorithm.
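The surrogate-gradient trick mentioned above works around the fact that a spike is a non-differentiable step function: the forward pass emits a true binary spike, while the backward pass substitutes a smooth approximation of its derivative. A minimal sketch of the idea, using the common "fast sigmoid" surrogate (illustrative only; Norse and snntorch wrap this in PyTorch autograd Functions):

```python
# Forward: hard threshold (non-differentiable, true binary spike).
def spike_forward(v, threshold=1.0):
    return 1.0 if v >= threshold else 0.0

# Backward: smooth stand-in for d(spike)/dv, the fast-sigmoid derivative
#   1 / (1 + slope * |v - threshold|)^2
# which peaks at the threshold and decays away from it, so gradients can
# flow through neurons that were near firing even if they did not spike.
def spike_surrogate_grad(v, threshold=1.0, slope=10.0):
    return 1.0 / (1.0 + slope * abs(v - threshold)) ** 2

g_near = spike_surrogate_grad(1.0)   # maximal right at the threshold
g_far = spike_surrogate_grad(3.0)    # nearly zero far from it
```

During BPTT, this surrogate derivative is used wherever the chain rule needs the (otherwise zero-almost-everywhere) derivative of the spike, which is what makes backprop-trained SNNs practical.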
- [P] Norse - Deep learning with spiking neural networks (SNNs) in PyTorch
- Show HN: Deep learning with spiking neural networks (SNNs) in PyTorch
- Don't Mess with Backprop: Doubts about Biologically Plausible Deep Learning
That repo is slightly outdated, development now continues at https://github.com/norse/norse.
What are some alternatives?
spikingjelly - SpikingJelly is an open-source deep learning framework for Spiking Neural Network (SNN) based on PyTorch.
Spiking-Neural-Network - Pure Python implementation of SNN
bindsnet - Simulation of spiking neural networks (SNNs) using PyTorch.
pytorch-forecasting - Time series forecasting with PyTorch
PyTorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
TorchGA - Train PyTorch Models using the Genetic Algorithm with PyGAD
Kilosort - Fast spike sorting with drift correction for up to a thousand channels
ocaml-torch - OCaml bindings for PyTorch
pycox - Survival analysis with PyTorch
Neuromorphic-Computing-Guide - Learn about the neuromorphic engineering process of creating very-large-scale integration (VLSI) systems containing electronic analog circuits to mimic neuro-biological architectures.