SaaSHub helps you find the best software and product alternatives
onnx-simplifier Alternatives
Similar projects and alternatives to onnx-simplifier
- PaddleOCR
  Awesome multilingual OCR toolkit based on PaddlePaddle: a practical, ultra-lightweight OCR system supporting recognition of 80+ languages, with data annotation and synthesis tools, and support for training and deployment on server, mobile, embedded, and IoT devices.
- nn
  🧑‍🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
- ncnn
  ncnn is a high-performance neural network inference framework optimized for mobile platforms.
- opencv-mobile
  The minimal OpenCV for Android, iOS, ARM Linux, Windows, Linux, macOS, HarmonyOS, WebAssembly, watchOS, tvOS, and visionOS.
- PaddleOCR2Pytorch
  PaddleOCR inference in PyTorch, converted from [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR).
onnx-simplifier reviews and mentions
- Show: Cross-platform Image segmentation on video using eGUI, onnxruntime and ffmpeg
  onnx-simplifier can shed some of the incompatibilities encountered in widespread use, but it is itself bug-ridden and lags behind the standard. For any serious model, or when you don't get lucky simplifying the model upstream, you generally want solid support for opset 11.
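The core trick behind this kind of simplification is constant folding: nodes whose inputs are all known at export time get precomputed into a single initializer, shrinking the graph. Below is a toy sketch of that idea using plain Python dicts as a stand-in for the graph; the node/initializer structure and the `fold_constants` helper are invented for illustration, not the real onnx or onnxsim API.

```python
# Toy sketch of constant folding, the kind of graph simplification
# onnx-simplifier performs. Dicts stand in for ONNX protobuf graphs;
# all names here are hypothetical, not the real onnx API.

def fold_constants(nodes, initializers):
    """Replace Add/Mul nodes whose inputs are all known constants
    with a precomputed constant, keeping the rest of the graph."""
    ops = {"Add": lambda a, b: a + b, "Mul": lambda a, b: a * b}
    consts = dict(initializers)
    remaining = []
    for node in nodes:
        op, inputs, output = node["op"], node["inputs"], node["output"]
        if op in ops and all(i in consts for i in inputs):
            a, b = (consts[i] for i in inputs)
            consts[output] = ops[op](a, b)  # fold: compute at export time
        else:
            remaining.append(node)          # keep: depends on a runtime input
    return remaining, consts

# A graph computing y = x * (2 + 3): the Add folds away, the Mul stays.
nodes = [
    {"op": "Add", "inputs": ["two", "three"], "output": "five"},
    {"op": "Mul", "inputs": ["x", "five"], "output": "y"},
]
remaining, consts = fold_constants(nodes, {"two": 2, "three": 3})
```

After folding, only the `Mul` node remains and `five` has become a plain constant, which is roughly what a simplified ONNX graph looks like after shape-inference-driven folding.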
- [Technical Article] OCR Upgrade
  ONNX Simplifier: https://github.com/daquexian/onnx-simplifier
- PyTorch 1.10
  As far as I know, the ONNX format won't give you a performance boost on its own. However, there are ONNX optimizers for ONNX Runtime that will speed up your inference.
  If you are using Nvidia hardware, TensorRT should give you the best performance possible, especially if you lower the precision level. Don't forget to simplify your ONNX model before converting it to TensorRT, though: https://github.com/daquexian/onnx-simplifier
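The "precision level" trade-off mentioned above (e.g. TensorRT's FP16 mode) comes from half-precision floats having only 10 mantissa bits, so weights and activations are rounded. This can be illustrated with Python's standard-library `struct` module, whose `"e"` format is IEEE 754 half precision; this is only a toy demonstration of the rounding effect, not TensorRT itself.

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a Python float through IEEE 754 half precision,
    using the stdlib struct format 'e' (binary16)."""
    return struct.unpack("e", struct.pack("e", x))[0]

# Pi cannot be represented exactly in 10 mantissa bits:
pi32 = 3.14159265
pi16 = to_fp16(pi32)       # rounds to 3.140625
error = abs(pi32 - pi16)   # ~1e-3 absolute error
```

For many vision models this rounding costs little accuracy while roughly doubling throughput on FP16-capable GPUs, which is why lowering precision is usually the first knob to try after simplification.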
www.saashub.com | 22 May 2025
Stats
daquexian/onnx-simplifier is an open source project licensed under the Apache License 2.0, an OSI-approved license.
The primary programming language of onnx-simplifier is C++.