Onnxruntime Alternatives
Similar projects and alternatives to onnxruntime
- CodeRabbit: AI Code Reviews for Developers. CodeRabbit offers PR summaries, code walkthroughs, 1-click suggestions, and AST-based analysis. Boost productivity and code quality across all major languages with each PR.
- txtai: 💡 All-in-one open-source embeddings database for semantic search, LLM orchestration, and language model workflows
- SaaSHub: Software Alternatives and Reviews. SaaSHub helps you find the best software and product alternatives.
- DirectXShaderCompiler: This repo hosts the source for the DirectX Shader Compiler, which is based on LLVM/Clang.
- transformers.js: State-of-the-art machine learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
- TensorRT: NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
- wonnx: A WebGPU-accelerated ONNX inference runtime written 100% in Rust, ready for native and the web
- vectordb: A minimal Python package for storing and retrieving text using chunking, embeddings, and vector search. (by kagisearch)
onnxruntime discussion
onnxruntime reviews and mentions
- 📝✨ ClearText
  Leveraged ONNX Runtime for optimized inference across different hardware.
- Day 49: Serving LLMs with ONNX Runtime
  ONNX Runtime documentation: ONNX Runtime
- Show HN: Remove-bg – open-source remove background using WebGPU
  First time seeing this; at first impression, the output quality of this one is better than mine, and my code is based on only one model.
  As I understand it, this is possible if the model's author builds for ONNX Runtime (https://onnxruntime.ai). The downside may be that users need to download a lot of data to their device; currently it's ~100-200 MB.
- Running Phi-3-vision via ONNX on Jetson Platform
  ONNX Runtime is a high-performance inference engine for executing ONNX (Open Neural Network Exchange) models. It provides a simple way to run large language models like Llama, Phi, Gemma, and Mistral via the onnxruntime-genai API.
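A minimal sketch of what that onnxruntime-genai flow looks like in Python. The model path is hypothetical (it stands in for a folder of exported ONNX model files, e.g. a quantized Phi-3-mini variant), and the API surface has shifted somewhat between onnxruntime-genai releases, so treat this as an outline rather than a definitive recipe:

```python
import onnxruntime_genai as og

# Hypothetical path: a directory of exported ONNX model files,
# such as the cpu-int4 variant of Phi-3-mini.
model = og.Model("./phi-3-mini-4k-instruct-onnx")
tokenizer = og.Tokenizer(model)

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)

generator = og.Generator(model, params)
generator.append_tokens(tokenizer.encode("What is ONNX Runtime?"))

# Token-by-token generation loop.
while not generator.is_done():
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))
```

The same loop structure applies to the other supported model families (Llama, Gemma, Mistral); only the exported model directory changes.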
- SamGIS - Segment Anything applied to GIS
  Starting from version 1.5.1, the backend integrates changes borrowed from sam_onnx_full_export to support OnnxRuntime 1.17.x and later versions. Note that on macOS, running the project directly from the command line suffers from memory leaks, making inference slower than normal. It's therefore best to run the project inside a Docker container, except during development or debugging.
- SamGIS - Segment Anything adapted to GIS
- Giving Odin Intelligence
  I've found an ONNX example suitable for my idea and am going to use it as a strong foundation for the project. To make things more interesting, I'll add a few enhancements:
  - New exponent functions that make SiLU and SoftMax 2x faster, at full accuracy
- Machine Learning with PHP
  ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
- AI Inference now available in Supabase Edge Functions
  Embedding generation uses the ONNX Runtime under the hood. This is a cross-platform inferencing library that supports multiple execution providers, from CPU to specialized GPUs.
Stats
microsoft/onnxruntime is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of onnxruntime is C++.