notebooks vs Made-With-ML
| | notebooks | Made-With-ML |
|---|---|---|
| Mentions | 2 | 51 |
| Stars | 24 | 35,801 |
| Growth | - | - |
| Activity | 0.0 | 6.8 |
| Latest commit | over 1 year ago | 5 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
notebooks
-
Neuromorphic learning, working memory, and metaplasticity in nanowire networks
This gives you a ludicrous advantage over current neural net accelerators, specifically 3-5 orders of magnitude in energy and time, as demonstrated in the BrainScaleS system https://www.humanbrainproject.eu/en/science-development/focu...
Unfortunately, that doesn't solve the problem of learning. Just because you can build efficient neuromorphic systems doesn't mean that we know how to train them. Briefly put, the problem is that a physical system has physical constraints. You can't just read the global state in NWN and use gradient descent as we would in deep learning. Rather, we have to somehow use local signals to approximate local behaviour that's helpful on a global scale. That's why they use Hebbian learning in the paper (what fires together, wires together), but it's tricky to get right and I haven't personally seen examples that scale to systems/problems of "interesting" sizes. This is basically the frontier of the field: we need local, but generalizable, learning rules that are stable across time and compose freely into higher-order systems.
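To make the locality point concrete, here is a minimal numpy sketch of a Hebbian update of the kind the comment alludes to. All names, shapes, and constants are illustrative assumptions of mine, not taken from the paper: the key property is that each weight changes based only on the activity of the two neurons it connects, with no global error signal.

```python
import numpy as np

# Toy Hebbian update: a weight strengthens when pre- and post-synaptic
# activity coincide ("what fires together, wires together").
def hebbian_step(w, pre, post, lr=0.01, decay=0.001):
    # Outer product captures coincident activity; the decay term keeps
    # weights from growing without bound.
    return w + lr * np.outer(post, pre) - decay * w

w = np.zeros((3, 4))                   # 4 presynaptic -> 3 postsynaptic
pre = np.array([1.0, 0.0, 1.0, 0.0])   # presynaptic spikes this step
post = np.array([1.0, 1.0, 0.0])       # postsynaptic spikes this step
w = hebbian_step(w, pre, post)
# Only weights between co-active pairs (e.g. pre 0 -> post 0) moved.
```

Note what the update does *not* use: no loss value, no gradient read from the rest of the network. That locality is exactly what makes such rules implementable in physical substrates, and also what makes them hard to steer toward globally useful behaviour.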
Regarding educational material, I'm afraid I haven't seen great entries for learning about SNNs in full generality. I co-author a simulator (https://github.com/norse/norse/) based on PyTorch with a few notebook tutorials (https://github.com/norse/notebooks) that may be helpful.
I'm actually working on some open resources/course material for neuromorphic computing. So if you have any wishes/ideas, please do reach out. Like, what would a newcomer be looking for specifically?
-
Event-Based Backpropagation for Exact Gradients in Spiking Neural Networks
We've written some documentation around our neuron equations in Python that explains this: https://norse.github.io/norse/auto_api/norse.torch.functiona...
See also our tutorial on neuron parameter optimization to understand how it's useful for machine learning: https://github.com/norse/notebooks#level-intermediate
Disclaimer: I'm a co-author of the library Norse
Regarding the target audience, it's actually not entirely clear to me. This lies in the intersection between computational neuroscience and deep learning, which isn't a huge set of people. Meaning, your questions are valid and we (as researchers) have a lot of communication to do to explain why this is interesting and important.
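The neuron parameter optimization the tutorial above covers can be sketched in miniature without any SNN library. The following is an illustrative numpy toy of mine, not Norse's actual API: it fits the leak parameter of a leaky integrator so its membrane trace matches a target trace, which is the basic idea behind fitting neuron parameters for machine learning.

```python
import numpy as np

# Leaky integrator: v decays by a factor alpha each step, then
# accumulates the input current. Returns the membrane trace.
def simulate(alpha, current):
    v, trace = 0.0, []
    for i in current:
        v = alpha * v + i
        trace.append(v)
    return np.array(trace)

rng = np.random.default_rng(0)
current = rng.uniform(0.0, 1.0, 100)
target = simulate(0.9, current)        # "ground truth" trace (alpha = 0.9)

alpha, lr, eps = 0.5, 0.01, 1e-5
for _ in range(200):
    # Central finite difference for d(loss)/d(alpha); a real library
    # would use autodiff (e.g. PyTorch) instead.
    loss = lambda a: np.mean((simulate(a, current) - target) ** 2)
    grad = (loss(alpha + eps) - loss(alpha - eps)) / (2 * eps)
    alpha -= lr * np.sign(grad)        # sign-based step keeps updates stable

# alpha should now sit close to the true value 0.9
```

In Norse the same idea runs through PyTorch autograd, so neuron parameters can be trained jointly with network weights rather than hand-fitted like this.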
Made-With-ML
-
[D] How do you keep up to date on Machine Learning?
Made With ML - Open-Source Production Machine Learning Course
-
Advice for switching careers within analytics
- Develop a (simple!) ML project and apply MLOps best practices to it. Ask ChatGPT all of your MLOps questions. I've joined this MLOps community and it has been very helpful for knowing what path to follow to get better at MLOps. Thanks to them I arrived at madewithml; I haven't done it yet, but it covers the whole MLOps side.
-
Recommendation for MLOps resources
Hey, I’m also working in ML. Here’s a great resource: https://madewithml.com. Also, check out Noah Gift’s book Practical MLOps.
-
Ask HN: Resource to learn how to train and use ML Models
-
Need help to find resources to learn ml ops
Try replicating this setup: https://madewithml.com/
-
MLops Resources
madewithml
-
Ask HN: How do I get started with MLOps?
There's a really nice website by Goku Mohandas called Made With ML. IMO it is the best practical guide to MLOps out there: https://madewithml.com
In case you want to dive a little deeper, https://fullstackdeeplearning.com/course/2022/ is also something that folks have recommended to me.
-
Resources for Current DE Interested in Learning Data Science
-
Do organizations still need machine learning engineers?
madewithml is pretty sweet, especially the MLOps side of things. It'll give you good skills in Python development and in deploying ML systems.
What are some alternatives?
DeepLearningExamples - State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.
zero-to-mastery-ml - All course materials for the Zero to Mastery Machine Learning and Data Science course.
fastai - The fastai deep learning library
mlops-zoomcamp - Free MLOps course from DataTalks.Club
NYU-DLSP20 - NYU Deep Learning Spring 2020
FLAML - A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
mlops-course - Learn how to design, develop, deploy and iterate on production-grade ML applications.
practical-mlops-book - [Book-2021] Practical MLOps O'Reilly Book
Copulas - A library to model multivariate data using copulas.
ETCI-2021-Competition-on-Flood-Detection - Experiments on Flood Segmentation on Sentinel-1 SAR Imagery with Cyclical Pseudo Labeling and Noisy Student Training
awesome-mlops - A curated list of references for MLOps
mlattacks - Machine Learning Attack Series