blindai vs steelix

| | blindai | steelix |
|---|---|---|
| Mentions | 6 | 2 |
| Stars | 490 | 36 |
| Growth | 1.4% | - |
| Activity | 8.0 | 10.0 |
| Latest commit | about 1 month ago | over 1 year ago |
| Language | Rust | Rust |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
blindai
-
[D] Any options for using GPT models with proprietary data?
We are working on an open-source project, BlindAI (https://github.com/mithril-security/blindai) to answer exactly that: privacy when sending data to remote AI models.
-
[P] Secret Whisper: Deploy OpenAI Whisper model with privacy using BlindAI
BlindAI (https://github.com/mithril-security/blindai) is an open-source confidential AI deployment solution. By using secure enclaves (Intel SGX for now, with AMD SEV and Nvidia Confidential Computing coming), we provide end-to-end protection for users' data, even when it is sent to the cloud for AI inference.
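To illustrate the flow described above, here is a minimal sketch in Rust. All types and functions are hypothetical stand-ins, not the actual BlindAI API, and the XOR "encryption" is for illustration only: the point is that the client only ever sends ciphertext, and only the enclave can decrypt, run inference, and return an encrypted result.

```rust
// Hypothetical sketch of confidential inference with an enclave.
// None of these types exist in BlindAI; they only illustrate the flow.

// Stand-in for a symmetric session key negotiated with the enclave.
#[derive(Clone)]
struct SessionKey(u8);

// Toy "encryption": XOR with the key byte (illustration only, NOT secure).
fn encrypt(key: &SessionKey, data: &[u8]) -> Vec<u8> {
    data.iter().map(|b| b ^ key.0).collect()
}
fn decrypt(key: &SessionKey, data: &[u8]) -> Vec<u8> {
    encrypt(key, data) // XOR is its own inverse
}

// Stand-in for the enclave: the only party holding the key,
// so the only party that ever sees plaintext.
struct Enclave { key: SessionKey }

impl Enclave {
    fn run_inference(&self, ciphertext: &[u8]) -> Vec<u8> {
        let input = decrypt(&self.key, ciphertext);
        // Trivial "model": sum of the input bytes, as a one-byte output.
        let output = vec![input.iter().map(|b| *b as u32).sum::<u32>() as u8];
        encrypt(&self.key, &output)
    }
}

fn main() {
    let key = SessionKey(0x5A);
    let enclave = Enclave { key: key.clone() };

    // Client side: data leaves the machine only in encrypted form.
    let ciphertext = encrypt(&key, &[1, 2, 3]);
    let encrypted_result = enclave.run_inference(&ciphertext);
    let result = decrypt(&key, &encrypted_result);
    assert_eq!(result, vec![6]);
    println!("result: {:?}", result);
}
```

In the real system the session key comes out of an attested TLS handshake with the enclave, so anyone intercepting traffic on the way to the cloud sees only ciphertext.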
-
[P] Introducing BlindAI, an Open-source, fast and privacy-friendly AI deployment solution. Benefit from state-of-the-art AI without ever revealing your data!
A good thing about enclaves is that the hardware protection enables us to use regular AES to secure communication with the enclave, which means no ciphertext expansion and a lightweight client side. We do not need a complicated client: just a slightly modified TLS client with additional security checks, such as remote attestation. You can see for yourself that our client side is light (https://github.com/mithril-security/blindai/tree/master/client).
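The "additional security checks" mentioned above boil down to verifying an attestation before trusting the server. A rough sketch of that gate, with entirely hypothetical types (real SGX attestation verifies a signed quote against Intel's infrastructure and is far more involved):

```rust
// Hypothetical sketch of client-side remote attestation checks performed
// before opening a session to an enclave. Only the shape is accurate.

// What the server claims about its enclave.
struct AttestationReport {
    // Hash of the enclave binary (MRENCLAVE in SGX terms),
    // shortened to 4 bytes for the sketch.
    enclave_measurement: [u8; 4],
    // Whether the quote's signature chain verified (assumed precomputed here).
    signature_valid: bool,
}

// The measurement the client expects, e.g. shipped with the client.
const EXPECTED_MEASUREMENT: [u8; 4] = [0xDE, 0xAD, 0xBE, 0xEF];

fn verify_attestation(report: &AttestationReport) -> Result<(), String> {
    if !report.signature_valid {
        return Err("attestation signature invalid".into());
    }
    if report.enclave_measurement != EXPECTED_MEASUREMENT {
        return Err("unexpected enclave measurement".into());
    }
    Ok(()) // safe to proceed with the handshake to this enclave
}

fn main() {
    let good = AttestationReport {
        enclave_measurement: EXPECTED_MEASUREMENT,
        signature_valid: true,
    };
    let bad = AttestationReport {
        enclave_measurement: [0; 4],
        signature_valid: true,
    };
    assert!(verify_attestation(&good).is_ok());
    assert!(verify_attestation(&bad).is_err());
    println!("attestation checks behave as expected");
}
```

Only once this check passes does the client derive session keys and send data, which is why the client can otherwise stay a near-standard TLS client.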
-
BlindAI: fast and privacy-friendly AI deployment solution in Rust
I am glad to introduce BlindAI, an AI deployment solution that leverages secure enclaves to make remotely hosted AI models privacy-friendly. We use the tract project as our inference engine to serve AI models in ONNX format inside an enclave, and the Rust SGX SDK to write the enclave itself in Rust.
- BlindAI: Open-source, fast and privacy-friendly AI deployment solution in Rust
steelix
-
Steelix - CLI for ONNX model analysis
repo: https://github.com/FL33TW00D/steelix
-
[P] ONNX model analysis tool in Rust
Check it out here: https://github.com/FL33TW00D/steelix Disclaimer: It's very much early days and may not work 100% for your model!
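For a flavor of what such an analysis does, here is a much-simplified sketch in Rust. This is not steelix's code: the node list is made up, and a real tool parses the ONNX protobuf and walks tensor shapes to estimate cost per node, where this sketch only tallies operator counts.

```rust
use std::collections::BTreeMap;

// Hypothetical, simplified view of an ONNX graph: just a list of op types.
// Tally how many times each operator appears in the graph.
fn summarize_ops(ops: &[&str]) -> BTreeMap<String, usize> {
    let mut counts = BTreeMap::new();
    for op in ops {
        *counts.entry(op.to_string()).or_insert(0) += 1;
    }
    counts
}

fn main() {
    // Made-up node list standing in for a parsed model graph.
    let ops = ["Conv", "Relu", "Conv", "Relu", "GlobalAveragePool", "Gemm"];
    let summary = summarize_ops(&ops);
    for (op, n) in &summary {
        println!("{:<20} {}", op, n);
    }
    assert_eq!(summary["Conv"], 2);
}
```

An op-count table like this is the starting point for the kind of per-operator breakdown a model analysis CLI prints.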
What are some alternatives?
incubator-teaclave-sgx-sdk - Apache Teaclave (incubating) SGX SDK helps developers write Intel SGX applications in the Rust programming language; also known as the Rust SGX SDK.
tractjs - Run ONNX and TensorFlow inference in the browser.
onnxruntime-rs - Rust wrapper for Microsoft's ONNX Runtime (version 1.8)
ire - I2P router implementation in Rust
altius - Small ONNX inference runtime written in Rust
rsrl - A fast, safe and easy to use reinforcement learning framework in Rust.
tract - Tiny, no-nonsense, self-contained, Tensorflow and ONNX inference [Moved to: https://github.com/sonos/tract]
whatlang-rs - Natural language detection library for Rust. Try demo online: https://whatlang.org/
ortex - ONNX Runtime bindings for Elixir
L2 - l2 is a fast, PyTorch-style Tensor + Autograd library written in Rust
wonnx - A WebGPU-accelerated ONNX inference run-time written 100% in Rust, ready for native and the web