Top 11 Rust ONNX Projects
-
burn
Burn is a comprehensive dynamic deep learning framework built in Rust, with extreme flexibility, compute efficiency, and portability as its primary goals.
-
wonnx
A WebGPU-accelerated ONNX inference runtime written 100% in Rust, ready for native and the web.
```toml
[package]
name = "resnet_burn"
version = "0.1.0"
edition = "2021"

[dependencies]
burn = { git = "https://github.com/tracel-ai/burn.git", rev = "75cb5b6d5633c1c6092cf5046419da75e7f74b11", features = ["ndarray"] }
burn-import = { git = "https://github.com/tracel-ai/burn.git", rev = "75cb5b6d5633c1c6092cf5046419da75e7f74b11" }
image = { version = "0.24.7", features = ["png", "jpeg"] }
```
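The `burn-import` dependency above generates Burn model code from an ONNX graph at build time. A minimal `build.rs` sketch of that step, following the `ModelGen` pattern from the burn-import docs (the ONNX file path is a placeholder; the actual `resnet_burn` layout may differ):

```rust
// build.rs — sketch: generate Burn model source from an ONNX file using
// burn-import's ModelGen. The input path below is hypothetical.
use burn_import::onnx::ModelGen;

fn main() {
    ModelGen::new()
        .input("src/model/resnet18.onnx") // placeholder path to the ONNX graph
        .out_dir("model/")                // generated code lands under OUT_DIR/model
        .run_from_script();
}
```

The generated module can then be included from the crate with the usual `include!(concat!(env!("OUT_DIR"), ...))` pattern.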
Tract is the most well-known ML crate in Rust, and I believe it can compile to WASM - https://github.com/sonos/tract/. Burn may also be useful - https://github.com/burn-rs/burn.
Project mention: Intel CEO: 'The entire industry is motivated to eliminate the CUDA market' | news.ycombinator.com | 2023-12-14
The two I know of are IREE and Kompute[1]. I'm not sure how much momentum the latter has; I don't see it referenced much. There's also a growing body of work that uses Vulkan indirectly through WebGPU. This is currently lagging in performance due to the lack of subgroups and cooperative matrix multiplication, but I see that gap closing. There I think wonnx[2] has the most momentum, but I am aware of other efforts.
[1]: https://kompute.cc/
[2]: https://github.com/webonnx/wonnx
To solve this, we built a native extension in Edge Runtime that enables using ONNX Runtime via its Rust interface. This was made possible thanks to an excellent Rust wrapper called ort.
Project mention: Small inference runtime for deep neural networks | news.ycombinator.com | 2023-07-23
Rust ONNX related posts
- Small inference runtime for deep neural networks
- Are there any ML crates that would compile to WASM?
- WebGPU ONNX inference runtime written in Rust
- rustformers/llm: Run inference for Large Language Models on CPU, with Rust 🦀🚀🦙
-
tract VS burn - a user suggested alternative
2 projects | 25 Mar 2023
- onnxruntime
- Steelix - CLI for ONNX model analysis
-
Index
What are some of the best open-source ONNX projects in Rust? This list will help you:
| # | Project | Stars |
|---|---|---|
| 1 | burn | 7,020 |
| 2 | tract | 2,050 |
| 3 | wonnx | 1,487 |
| 4 | ort | 542 |
| 5 | blindai | 489 |
| 6 | rust-mlops-template | 271 |
| 7 | onnxruntime-rs | 261 |
| 8 | altius | 86 |
| 9 | tractjs | 76 |
| 10 | steelix | 36 |
| 11 | yolov5-api-rust | 24 |