blindai
incubator-teaclave-sgx-sdk
| | blindai | incubator-teaclave-sgx-sdk |
|---|---|---|
| Mentions | 6 | 4 |
| Stars | 489 | 1,148 |
| Growth | 1.2% | 1.1% |
| Activity | 8.0 | 3.4 |
| Last commit | about 1 month ago | about 1 month ago |
| Language | Rust | Rust |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
blindai
- [D] Any options for using GPT models using proprietary data?
  We are working on an open-source project, BlindAI (https://github.com/mithril-security/blindai), to answer exactly that: privacy when sending data to remote AI models.
- [P] Secret Whisper: Deploy OpenAI Whisper model with privacy using BlindAI
  BlindAI (https://github.com/mithril-security/blindai) is an open-source confidential AI deployment solution. By using secure enclaves (Intel SGX for now, with AMD SEV and Nvidia Confidential Computing coming soon), we provide end-to-end protection for users' data, even when it is sent to the cloud for AI inference.
- [P] Introducing BlindAI, an open-source, fast and privacy-friendly AI deployment solution. Benefit from state-of-the-art AI without ever revealing your data!
  A good thing about enclaves is that the hardware protection lets us use regular AES to secure communication with the enclave, which means no ciphertext expansion and a lightweight client side. We do not need a complicated client: a slightly modified TLS client with a few additional security checks, such as remote attestation, is enough. You can have a look at our client side; it is light (https://github.com/mithril-security/blindai/tree/master/client).
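The "no ciphertext expansion" point above can be illustrated with a toy example. This is a hedged sketch, not BlindAI's actual code: a trivial XOR stream cipher stands in for AES, showing that symmetric encryption keeps the ciphertext exactly as long as the plaintext, in contrast to homomorphic encryption schemes whose ciphertexts grow.

```rust
// Toy illustration only (NOT real cryptography): a length-preserving XOR
// stream cipher standing in for AES. Symmetric ciphers used with an enclave
// add no ciphertext expansion, which keeps the client side lightweight.
fn xor_cipher(data: &[u8], key: &[u8]) -> Vec<u8> {
    data.iter()
        .zip(key.iter().cycle()) // repeat the key across the message
        .map(|(b, k)| b ^ k)
        .collect()
}

fn main() {
    let plaintext = b"tensor bytes sent to the enclave";
    let key = b"session-key";

    let ciphertext = xor_cipher(plaintext, key);
    // No expansion: the ciphertext is exactly as long as the plaintext.
    assert_eq!(ciphertext.len(), plaintext.len());
    // Applying the same keystream again recovers the plaintext.
    assert_eq!(xor_cipher(&ciphertext, key), plaintext.to_vec());
}
```

A real deployment would of course use AES-GCM inside an attested TLS session rather than XOR, but the size property being illustrated is the same.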
- BlindAI: fast and privacy-friendly AI deployment solution in Rust
  I am glad to introduce BlindAI, an AI deployment solution that leverages secure enclaves to make remotely hosted AI models privacy-friendly. We use the tract project as our inference engine to serve AI models in ONNX format inside an enclave, and the Rust SGX SDK to write the enclave itself in Rust.
- BlindAI: Open-source, fast and privacy-friendly AI deployment solution in Rust
incubator-teaclave-sgx-sdk
- How to protect information in memory?
  https://github.com/apache/incubator-teaclave-sgx-sdk would be a great fit for this.
- BlindAI: fast and privacy-friendly AI deployment solution in Rust
  I am glad to introduce BlindAI, an AI deployment solution that leverages secure enclaves to make remotely hosted AI models privacy-friendly. We use the tract project as our inference engine to serve AI models in ONNX format inside an enclave, and the Rust SGX SDK to write the enclave itself in Rust.
- Cargo patches
  I am wondering about something. If you look at this Cargo.toml file on GitHub, you can see at line 10 that the crate "sgx_crypto_helper" is declared as a dependency pointing at a git repository. At line 19, however, the same dependency "sgx_crypto_helper" is declared again with basically the same path, so what is the purpose of this?
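One common reason a crate appears twice like this is Cargo's `[patch]` mechanism. The sketch below is hypothetical (the paths are illustrative, not taken from the file in question): entries under `[patch]` are not additional dependencies; they tell Cargo to substitute that source for every occurrence of the crate, including transitive ones pulled in by other dependencies, so all copies resolve to the same code.

```toml
# Hypothetical Cargo.toml sketch -- not the actual file being discussed.
# The [dependencies] entry pulls sgx_crypto_helper directly from git.
[dependencies]
sgx_crypto_helper = { git = "https://github.com/apache/incubator-teaclave-sgx-sdk.git" }

# The [patch] entry is not a second dependency: it redirects *every* use of
# sgx_crypto_helper from that git source (direct or transitive) to one
# substitute, here a local checkout, so the whole graph agrees on one version.
[patch."https://github.com/apache/incubator-teaclave-sgx-sdk.git"]
sgx_crypto_helper = { path = "../incubator-teaclave-sgx-sdk/sgx_crypto_helper" }
```

Seeing nearly the same path in both places is therefore consistent with pinning or overriding a git dependency rather than a mistake, though only the file's history can confirm the authors' intent.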
What are some alternatives?
onnxruntime-rs - Rust wrapper for Microsoft's ONNX Runtime (version 1.8)
confidential-computing-zoo - Confidential Computing Zoo provides confidential computing solutions based on Intel SGX, TDX, HEXL, etc. technologies.
ire - I2P router implementation in Rust
deno - A modern runtime for JavaScript and TypeScript.
rsrl - A fast, safe and easy to use reinforcement learning framework in Rust.
tract - Tiny, no-nonsense, self-contained, Tensorflow and ONNX inference
steelix - Your one stop CLI for ONNX model analysis.
rust - Empowering everyone to build reliable and efficient software.
whatlang-rs - Natural language detection library for Rust. Try demo online: https://whatlang.org/
incubator-teaclave-trustzone-sdk - Teaclave TrustZone SDK enables safe, functional, and ergonomic development of trustlets.
L2 - l2 is a fast, Pytorch-style Tensor+Autograd library written in Rust
bat - A cat(1) clone with wings.