tch-rs
candle
| | tch-rs | candle |
|---|---|---|
| Mentions | 37 | 17 |
| Stars | 3,843 | 13,376 |
| Star growth | - | 8.3% |
| Activity | 7.5 | 9.9 |
| Latest commit | 4 days ago | 7 days ago |
| Language | Rust | Rust |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
tch-rs
- Tch-Rs
- Llama2.rs: One-file Rust implementation of Llama2
I wanted to do something like this, but then I would miss out on proper CUDA acceleration and lose performance compared to using libtorch.
I wrote a forgettable llama implementation for https://github.com/LaurentMazare/tch-rs (Rust bindings for PyTorch's libtorch).
- Playing Atari Games in OCaml
I first encountered OCaml's PyTorch bindings because apparently they generate a C wrapper around PyTorch's C++ API, and Rust's PyTorch bindings use OCaml's C wrapper. See: https://github.com/LaurentMazare/tch-rs
- llm: a Rust crate/CLI for CPU inference of LLMs, including LLaMA, GPT-NeoX, GPT-J and more
You could try looking at the min-GPT example of tch-rs. I'd also strongly suggest watching Karpathy's video to understand what's going on.
- Simply explained: How does GPT work?
If you prefer to see it in code, there's a succinct GPT implementation here https://github.com/LaurentMazare/tch-rs/blob/main/examples/m...
- Will I ever need python again if I learn rust other than for AI stuff?
Rust is fully compatible with C bindings, so even Python libraries written in C can be easily set up to work in Rust (and have been). For example, see the PyTorch Rust bindings, which actually run faster than the Python ones because all of the glue code around the C++ API is in Rust instead of Python.
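Bindings like tch-rs work exactly this way: libtorch's C++ API is exposed through a C interface, and Rust calls into it over FFI. A minimal sketch of the mechanism, using libc's `abs` as a stand-in for a libtorch function:

```rust
// Declare an external C function; the symbol is resolved at link time.
// This is the same FFI mechanism tch-rs uses against libtorch's C API.
extern "C" {
    fn abs(x: i32) -> i32; // from C's <stdlib.h>
}

fn c_abs(x: i32) -> i32 {
    // FFI calls are `unsafe` because the compiler cannot verify the C side.
    unsafe { abs(x) }
}

fn main() {
    println!("{}", c_abs(-5)); // prints 5
}
```

Projects like tch-rs wrap such raw `extern "C"` calls in safe, idiomatic Rust types, which is where the "glue code" lives.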
- A Rust client library for interacting with Microsoft Airsim https://github.com/Sollimann/airsim-client
Pytorch
- [D] HuggingFace in Julia or Rust ?
- This year I tried solving AoC using Rust, here are my impressions coming from Python!
- [Help Needed] Deployment of torchscript using rust
I have looked into this a bit and found a crate called tch-rs that helps with loading TorchScript models.
candle
- karpathy/llm.c
Candle already exists[1], and it runs pretty well. Can use both CUDA and Metal backends (or just plain-old CPU).
[1] https://github.com/huggingface/candle
- Best alternative for python
- Is there any LLM that can be installed without python
Check out Candle! It's a Deep Learning framework for Rust. You can run LLMs in binaries.
- Announcing Kalosm - a local-first AI meta-framework for Rust
Kalosm is a meta-framework for AI written in Rust using candle. Kalosm supports local quantized large language models like Llama, Mistral, Phi-1.5, and Zephyr. It also supports other quantized models like Wuerstchen, Segment Anything, and Whisper. In addition to local models, Kalosm supports remote models like GPT-4 and ada embeddings.
- RFC: candle-lora
I have been working on a machine learning library called candle-lora for Candle. It implements a technique called LoRA (low-rank adaptation), which reduces a model's trainable parameter count by freezing the original layers and wrapping them with small trainable low-rank adapters.
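The technique fits in one equation: the frozen weight matrix is left untouched while a low-rank update is learned (notation follows the standard LoRA formulation, not candle-lora's specific API):

```latex
W' = W + \frac{\alpha}{r} B A,
\qquad W \in \mathbb{R}^{d \times k} \text{ (frozen)},
\quad B \in \mathbb{R}^{d \times r},
\quad A \in \mathbb{R}^{r \times k},
\quad r \ll \min(d, k)
```

Only $A$ and $B$ are trained, so the trainable parameter count drops from $d \cdot k$ to $r(d + k)$.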
- ExecuTorch: Enabling On-Device Inference for embedded devices
[2] https://github.com/huggingface/candle/issues/313
- [P] Open-source project to run LLMs locally in the browser, such as Phi-1.5 for fully private inference
We provide fully local inference in the browser, using Hugging Face libraries like transformers.js or candle for WASM inference.
- Update on the Candle ML framework.
We first announced Candle, a minimalist ML framework in Rust, 6 weeks ago. Since then we've focused on adding various recent models and improving the framework to support the necessary features efficiently. You can check out a gallery of the examples; supported models include:
- Should I Haskell or OCaml?
How did you select those two as your options?
I'm just a hobbyist that enjoys programming, and I eventually wanted to expand beyond python. I looked at Haskell and read Learn You a Haskell and did some Exercism exercises but never got anywhere close to being able to use it for real projects. Have been trying to learn about Lisp lately and feel like I've come to a similar dead end.
On the other hand, both Go and Rust have felt fulfilling and practical, with static typing and solid tooling, cross-compilation, static binaries, and dependency management that is a huge breath of fresh air coming from python.
The ML / data science scene is nowhere near as developed as in Python, and I still lean on jupyter/polars/PyTorch here, but I think the candle project[0] seems very interesting. Compiling whisper down to a single CUDA-leveraging binary for fast local transcription is pretty cool!
[0]: https://github.com/huggingface/candle
- Minimalist ML framework for Rust
What are some alternatives?
onnxruntime - ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
Universal-G-Code-Sender - A cross-platform G-Code sender for GRBL, Smoothieware, TinyG and G2core.
cbindgen - A project for generating C bindings from Rust code
burn - Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals. [Moved to: https://github.com/Tracel-AI/burn]
wtpsplit - Code for Where's the Point? Self-Supervised Multilingual Punctuation-Agnostic Sentence Segmentation
bCNC - GRBL CNC command sender, autoleveler and g-code editor
veloren - An open world, open source voxel RPG inspired by Dwarf Fortress and Cube World. This repository is a mirror. Please submit all PRs and issues on our GitLab page.
gsender - Connect to and control Grbl-based CNCs with ease
cncjs - A web-based interface for CNC milling controller running Grbl, Marlin, Smoothieware, or TinyG.
rustlearn - Machine learning crate for Rust
cncjs-kt-ext - Auto-leveling extension for CNCjs