ritual vs onnxruntime
| | ritual | onnxruntime |
|---|---|---|
| Mentions | 6 | 54 |
| Stars | 1,196 | 12,656 |
| Stars growth (monthly) | 0.9% | 4.6% |
| Activity | 0.0 | 10.0 |
| Latest commit | about 1 year ago | 4 days ago |
| Language | Rust | C++ |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ritual
-
Question about including parent directory C++ files in Rust crate
For inspiration on getting C++ code into a crate: https://github.com/rust-qt/examples uses the ritual build system (https://github.com/rust-qt/ritual), which integrates Qt's C++ code into those Cargo-based Rust examples. I would like to highlight the todo-list example. Build and run it verbosely with "--verbose --verbose".
-
CXX-Qt: safe Rust bindings for Qt
It is great to see how many people want to bring Qt support to Rust and are trying to do so, and I hope that these folks succeed, but it’s wearisome to me how they each create a new project instead of working with others who are already in this problem space. Of the half-dozen or so[0] existing attempts so far to create Qt bindings to Rust, none of them have actually succeeded because they’ve either been abandoned midway or limit their support to QML. Ritual[1] is the only crate I’ve seen that attempts to actually expose the whole Qt API, but it’s pretty awful to use, incomplete, and dead.
Rust doesn’t need more Qt crates. It needs one Qt crate that is complete and works well. (Or, ideally, a native Rust cross-platform GUI crate that works as well as Qt, but that’s an even longer and harder task.)
[0] https://lib.rs/search?q=qt
[1] https://github.com/rust-qt/ritual
-
Use a CPP library from Rust
Just wanted to add another vote for https://github.com/rust-qt/ritual that 0OOO00000OO00O0O0OOO/ mentioned below.
-
GUI library for Qt?
There was a Qt library, rust-qt (officially supported, I believe), with bindings generated by Ritual. There is an open issue for Qt 6 support, which I'm also awaiting: https://github.com/rust-qt/ritual/issues/109.
- Qt 6.2 LTS Released
onnxruntime
-
Machine Learning with PHP
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
-
AI Inference now available in Supabase Edge Functions
Embedding generation uses the ONNX runtime under the hood. This is a cross-platform inferencing library that supports multiple execution providers from CPU to specialized GPUs.
-
Deep Learning in JavaScript
tfjs is dead, looking at the commit history. The standard now is to convert PyTorch models to ONNX, then use onnxruntime (https://github.com/microsoft/onnxruntime/tree/main/js/web) to run the model in the browser.
- FLaNK Stack 05 Feb 2024
-
Vcc – The Vulkan Clang Compiler
- slang[2] has potential, but its metaprogramming is not as strong as C++'s, and existing libraries cannot be used.
The above conclusion is drawn from my work on https://github.com/microsoft/onnxruntime/tree/dev/opencl; it was a pure nightmare to work with those drivers and JIT compilers. Hopefully Vcc takes compute shaders more seriously.
[1]: https://www.circle-lang.org/
-
Oracle-samples/sd4j: Stable Diffusion pipeline in Java using ONNX Runtime
I did. It depends what you want, for an overview of how ONNX Runtime works then Microsoft have a bunch of things on https://onnxruntime.ai, but the Java content is a bit lacking on there as I've not had time to write much. Eventually I'll probably write something similar to the C# SD tutorial they have on there but for the Java API.
For writing ONNX models from Java we added an ONNX export system to Tribuo in 2022 which can be used by anything on the JVM to export ONNX models in an easier way than writing a protobuf directly. Tribuo doesn't have full coverage of the ONNX spec, but we're happy to accept PRs to expand it, otherwise it'll fill out as we need it.
- Mamba-Chat: A Chat LLM based on State Space Models
-
VectorDB: Vector Database Built by Kagi Search
What about models besides GPT? Most of the popular vector encoding models aren't using this architecture.
If you really didn't want PyTorch/Transformers, you could consider exporting your models to ONNX (https://github.com/microsoft/onnxruntime).
- ONNX runtime: Cross-platform accelerated machine learning
- Onnx Runtime: “Cross-Platform Accelerated Machine Learning”
What are some alternatives?
Rust Qt Binding Generator - Generate bindings to use Rust code in Qt and QML
onnx - Open standard for machine learning interoperability
cxx - Safe interop between Rust and C++
onnx-tensorrt - ONNX-TensorRT: TensorRT backend for ONNX
qt.rs - Qt5 bindings for the Rust language (stalled)
onnx-simplifier - Simplify your onnx model
QMetaObject crate for Rust - Integrate QML and Rust by building the QMetaObject at compile time.
ONNX-YOLOv7-Object-Detection - Python scripts performing object detection using the YOLOv7 model in ONNX.
imgui-rs - Rust bindings for Dear ImGui
onnx-tensorflow - Tensorflow Backend for ONNX
conrod - An easy-to-use, 2D GUI library written entirely in Rust.
MLflow - Open source platform for the machine learning lifecycle