| | mamba | openvino_notebooks |
|---|---|---|
| Mentions | 15 | 79 |
| Stars | 9,506 | 1,972 |
| Stars growth (monthly) | 15.3% | 4.2% |
| Activity | 8.1 | 9.9 |
| Latest commit | 8 days ago | 2 days ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
mamba
- Based: Simple linear attention language models
> how the recall can grow unbounded with no tradeoff
this? https://github.com/state-spaces/mamba/issues/175
- Mamba: The Easy Way
If you want to learn this stuff as a computer engineer, you can read the code here [0]. I find the math quite helpful.
[0]: https://github.com/state-spaces/mamba
- FLaNK Stack 05 Feb 2024
- Introduction to State Space Models (SSM)
- Fortran inference code for the Mamba state space language model
This model was discussed recently (https://news.ycombinator.com/item?id=38522428). It's a new kind of ML model architecture that can be used instead of a transformer in LLMs.
See also the original repo from the paper: https://github.com/state-spaces/mamba
- Mamba outperforms transformers "everywhere we tried"
[2] - https://github.com/state-spaces/mamba
Out of curiosity, does anyone feel there's any benefit to linking to reddit when we can link directly to the source? I for one don't click through and read the discussion on reddit - if I wanted that sort of discussion, I would browse there, not HN.
- GitHub – State-Spaces/Mamba
- Generate valid JSON with Mamba models
The library is compatible with any auto-regressive model, not just transformers. To prove our point we integrated Mamba, a new state-space model architecture, into the library. Try it out!
- [D] Thoughts on Mamba?
I ran Karpathy's nanoGPT with self-attention replaced by Mamba on his TinyShakespeare dataset, and within 5 minutes it started spitting out the following:
(A minimal sketch of that attention-for-Mamba swap appears after this list.)
- Mamba-Chat: A Chat LLM based on State Space Models
You might have come across the Mamba paper in the last few days, which was the first attempt at scaling up state space models to 2.8B parameters to work on language data.
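As a minimal sketch of the nanoGPT experiment mentioned above: the snippet below shows one way to swap nanoGPT's causal self-attention sub-layer for the Mamba mixer from the state-spaces/mamba package (`pip install mamba-ssm`). The `MambaBlock` class and the `n_embd` name follow nanoGPT's conventions but are this page's illustration, not code from the post; the hyperparameters (`d_state=16`, `d_conv=4`, `expand=2`) are the defaults from the Mamba README.

```python
import torch
import torch.nn as nn
from mamba_ssm import Mamba  # official state-spaces/mamba implementation


class MambaBlock(nn.Module):
    """nanoGPT-style residual block with Mamba standing in for self-attention."""

    def __init__(self, n_embd: int):
        super().__init__()
        self.ln_1 = nn.LayerNorm(n_embd)
        # Mamba is causal by construction, so no attention mask is needed.
        self.mixer = Mamba(d_model=n_embd, d_state=16, d_conv=4, expand=2)
        self.ln_2 = nn.LayerNorm(n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.GELU(),
            nn.Linear(4 * n_embd, n_embd),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.mixer(self.ln_1(x))  # was: x + self.attn(self.ln_1(x))
        x = x + self.mlp(self.ln_2(x))
        return x


# Quick shape check on a toy batch (mamba-ssm currently requires a CUDA device).
block = MambaBlock(n_embd=384).to("cuda")
tokens = torch.randn(8, 256, 384, device="cuda")  # (batch, seq_len, n_embd)
assert block(tokens).shape == tokens.shape
```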
openvino_notebooks
- FLaNK AI Weekly 18 March 2024
- FLaNK Stack Weekly 19 Feb 2024
- FLaNK Stack Weekly 12 February 2024
- FLaNK Stack 05 Feb 2024
- Optimum Intel OpenVino Performance
Also, credit for using zram in your VM setup; that's a smart hack for memory management. Have you tried tweaking other models like the ones in this OpenVINO notebook?
(A minimal sketch of loading a model through Optimum Intel's OpenVINO backend appears after this list.)
- FLaNK Stack Weekly 06 Nov 2023
- Find it faster
- Change your voice. FreeVC offers one-shot voice conversion, no text transcript required. Explore how OpenVINO powers AI solutions and see the code on GitHub.
- You'll be all smiles
- See the invisible
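Relating to the Optimum Intel post above, here is a minimal sketch assuming the optimum-intel package with its OpenVINO extra is installed (`pip install optimum[openvino]`); the `gpt2` model id is only a placeholder, not a model taken from the post.

```python
# Export a Hugging Face causal LM to OpenVINO IR and generate text on CPU.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "gpt2"  # placeholder; substitute the model you want to benchmark
tokenizer = AutoTokenizer.from_pretrained(model_id)
# export=True converts the PyTorch checkpoint to OpenVINO IR on the fly.
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

inputs = tokenizer("OpenVINO makes CPU inference", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```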
What are some alternatives?
miniforge - A conda-forge distribution.
chdb - chDB is an embedded OLAP SQL Engine 🚀 powered by ClickHouse
pip - The Python package installer
deepeval - The LLM Evaluation Framework
llm.f90 - LLM inference in Fortran
super-gradients - Easily train or fine-tune SOTA computer vision models with one open source training library. The home of YOLO-NAS.
conda - A system-level, binary package and environment manager running on all major operating systems and platforms.
open_model_zoo - Pre-trained Deep Learning models and demos (high quality and extremely fast)
mamba-chat - Mamba-Chat: A chat LLM based on the state-space model architecture 🐍
starcoder - Home of StarCoder: fine-tuning & inference!
spack - A flexible package manager that supports multiple versions, configurations, platforms, and compilers.
FLiPStackWeekly - FLaNK AI Weekly covering Apache NiFi, Apache Flink, Apache Kafka, Apache Spark, Apache Iceberg, Apache Ozone, Apache Pulsar, and more...