tractjs vs tract


| | tractjs | tract |
|---|---|---|
| Mentions | 1 | 20 |
| Stars | 75 | 2,322 |
| Growth | - | 2.0% |
| Activity | 0.0 | 9.9 |
| Last Commit | about 2 years ago | 3 days ago |
| Language | Rust | Rust |
| License | GNU General Public License v3.0 or later | Apache 2.0/MIT |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
tractjs
-
Run WASM, a client-side Python runtime
TensorFlow (and by extension Keras) offloads most of the actual work to C++ or C, so compiling those to WebAssembly would (I imagine) be a herculean effort.
Instead, the TF team maintains TFJS, which can run on WebAssembly[0].
There are also tractjs[1] and onnxjs[2], both of which let you run (most) ONNX models (ONNX is an open standard for specifying ML models) using WebAssembly and WebGL (only onnxjs supports WebGL). A bunch of frameworks (Caffe, PyTorch, TF) support exporting to/importing from ONNX.
[0] https://blog.tensorflow.org/2020/03/introducing-webassembly-...
[1] https://github.com/bminixhofer/tractjs
[2] https://github.com/microsoft/onnxjs
tract
-
Are there any ML crates that would compile to WASM?
Tract is the best-known ML crate in Rust, and I believe it can compile to WASM - https://github.com/sonos/tract/. Burn may also be useful - https://github.com/burn-rs/burn.
-
[Discussion] What crates would you like to see?
tract!!
-
tract VS burn - a user-suggested alternative
2 projects | 25 Mar 2023
-
Machine Learning Inference Server in Rust?
We use tract for inference, integrated into our runtime and services.
-
onnxruntime
-
Rust Native ML Frameworks?
-
Neural networks - what crates to use?
Not for training, but for inference this looks nice: https://github.com/sonos/tract
-
Brain.js: GPU Accelerated Neural Networks in JavaScript
There's also tract, from Sonos[0]. 100% Rust.
I'm currently trying to use it to do speech recognition with a variant of the Conformer architecture (exported to ONNX).
The final goal is to do it in WASM client-side.
[0] https://github.com/sonos/tract
-
Serving ML at the Speed of Rust
As the article notes, there isn't official Rust-native support from any of the common frameworks.
tract (https://github.com/sonos/tract) seems like the most mature option for ONNX (to which TF/PyTorch export is good nowadays), and recently it got BERT working successfully.
-
Run deep neural network models from scratch
There are some DL libraries written in Rust: https://github.com/sonos/tract , https://docs.rs/neuronika/latest/neuronika/index.html . The second one could be used for training, I think.
What are some alternatives?
neuronika - Tensors and dynamic neural networks in pure Rust.
onnxruntime-rs - Rust wrapper for Microsoft's ONNX Runtime (version 1.8)
wonnx - A WebGPU-accelerated ONNX inference run-time written 100% in Rust, ready for native and the web
run-wasm - Run WASM based code executions in the browser easily
bevy_webgl2 - WebGL2 renderer plugin for Bevy game engine
steelix - Your one stop CLI for ONNX model analysis.
linfa - A Rust machine learning framework.
kosmonaut - A web browser engine for the space age :rocket:
MTuner - MTuner is a C/C++ memory profiler and memory leak finder for Windows, PlayStation 3/4/5, Nintendo Switch, Android and other platforms
onnxjs - ONNX.js: run ONNX models using JavaScript
gamma - Computational graphs with reverse automatic differentiation on the GPU

