Onnx-tensorrt Alternatives
Similar projects and alternatives to onnx-tensorrt
-
server
The Triton Inference Server provides an optimized cloud and edge inferencing solution. (by triton-inference-server)
-
TensorRT
NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
-
jetson-inference
Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
-
deepC
Vendor-independent TinyML deep learning library, compiler, and inference framework for microcomputers and microcontrollers.
-
onnx-tensorrt discussion
onnx-tensorrt reviews and mentions
-
Introducing Cellulose - an ONNX model visualizer with hardware runtime support annotations
[1] - We use onnx-tensorrt for these TensorRT compatibility checks.
-
[P] [D] How to get a TensorFlow model to run on a Jetson Nano?
Conversion was done from Keras/TensorFlow to ONNX using https://github.com/onnx/keras-onnx, followed by ONNX to TensorRT using https://github.com/onnx/onnx-tensorrt. The Python code used for inference with TensorRT can be found at https://github.com/jonnor/modeld/blob/tensorrt/tensorrtutils.py
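The ONNX-to-TensorRT step described above can be sketched with the TensorRT Python API. This is a minimal, hedged sketch (not the code from the linked tensorrtutils.py), assuming the TensorRT 7-era API that ships with JetPack 4.x; the model path and the 1 GiB workspace size are placeholders.

```python
def build_engine(onnx_path, max_workspace_bytes=1 << 30):
    """Parse an ONNX file and build a serialized TensorRT engine.

    Sketch only: requires a TensorRT installation (import is deferred
    so the definition itself has no hard dependency). `onnx_path` is a
    hypothetical path to a converted model, e.g. "model.onnx".
    """
    import tensorrt as trt  # provided by the TensorRT installation

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    # Explicit-batch networks are what the ONNX parser expects.
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    # TensorRT 7-style workspace limit; newer releases use
    # config.set_memory_pool_limit(...) instead.
    config.max_workspace_size = max_workspace_bytes
    return builder.build_serialized_network(network, config)
```

The serialized engine can then be written to disk once and deserialized at startup, which avoids rebuilding the engine on every run of the Jetson device.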
-
New to this: could I use Nvidia Nano + lobe?
Hi! You can run the models trained in Lobe on the Jetson Nano, either through TensorFlow (https://docs.nvidia.com/deeplearning/frameworks/install-tf-jetson-platform/index.html), ONNX runtime (https://elinux.org/Jetson_Zoo#ONNX_Runtime), or running ONNX on TensorRT (https://github.com/onnx/onnx-tensorrt).
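For the ONNX Runtime route mentioned above, inference is a few lines. A minimal sketch, assuming an exported ONNX model and a NumPy input already shaped to the model's expected dimensions (the path and provider choice are assumptions, not Lobe-specific):

```python
def run_onnx(model_path, input_array):
    """Run one inference pass with ONNX Runtime.

    Sketch only: requires the onnxruntime package (import deferred).
    `model_path` is a hypothetical exported model, e.g. "model.onnx".
    """
    import onnxruntime as ort  # pip install onnxruntime

    # On Jetson, a CUDA/TensorRT execution provider could be listed first.
    sess = ort.InferenceSession(model_path,
                                providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    # None = return all model outputs.
    return sess.run(None, {input_name: input_array})
```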
-
How to install ONNX-TensorRT Python Backend on Jetpack 4.5
Hello, I would like to install https://github.com/onnx/onnx-tensorrt from a package, because compiling it is complicated. Is there a package source for this?
-
Stats
onnx/onnx-tensorrt is an open source project licensed under the Apache License 2.0, an OSI-approved license.
The primary programming language of onnx-tensorrt is C++.