keras-onnx VS onnx-tensorrt

Compare keras-onnx vs onnx-tensorrt and see how they differ.

keras-onnx

Convert tf.keras/Keras models to ONNX (by onnx)
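As a rough sketch of typical usage (the model path my_model.h5 is hypothetical), keras-onnx converts an in-memory tf.keras model into an ONNX graph, which can then be saved with the onnx package:

    import onnx
    import keras2onnx
    from tensorflow import keras

    # Load a previously trained tf.keras model (hypothetical path).
    model = keras.models.load_model("my_model.h5")

    # Convert the Keras model to an ONNX model proto.
    onnx_model = keras2onnx.convert_keras(model, model.name)

    # Persist the converted model for use with ONNX runtimes and backends.
    onnx.save_model(onnx_model, "my_model.onnx")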

onnx-tensorrt

ONNX-TensorRT: TensorRT backend for ONNX (by onnx)
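As a rough sketch of typical usage (the model path and the 32x3x224x224 input shape are assumptions for illustration), onnx-tensorrt provides a TensorRT-backed ONNX backend that parses a serialized ONNX model and runs inference on the GPU:

    import numpy as np
    import onnx
    import onnx_tensorrt.backend as backend

    # Load a serialized ONNX model (hypothetical path).
    model = onnx.load("my_model.onnx")

    # Build a TensorRT engine for the model on the first GPU.
    engine = backend.prepare(model, device="CUDA:0")

    # Run inference; the input shape here is only an example.
    input_data = np.random.random(size=(32, 3, 224, 224)).astype(np.float32)
    output_data = engine.run(input_data)[0]
    print(output_data.shape)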
                  keras-onnx          onnx-tensorrt
Mentions          2                   4
Stars             329                 2,749
Growth            -                   2.1%
Activity          4.4                 4.1
Latest commit     over 2 years ago    24 days ago
Language          Python              C++
License           Apache License 2.0  Apache License 2.0
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

keras-onnx

Posts with mentions or reviews of keras-onnx. We have used some of these posts to build our list of alternatives and similar projects. The latest mention was on 2022-03-30.

onnx-tensorrt

Posts with mentions or reviews of onnx-tensorrt. We have used some of these posts to build our list of alternatives and similar projects. The latest mention was on 2021-06-02.

What are some alternatives?

When comparing keras-onnx and onnx-tensorrt you can also consider the following projects:

RobustVideoMatting - Robust Video Matting in PyTorch, TensorFlow, TensorFlow.js, ONNX, CoreML!

onnxruntime - ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator

MMdnn - MMdnn is a set of tools to help users interoperate among different deep learning frameworks, e.g. model conversion and visualization. Convert models between Caffe, Keras, MXNet, TensorFlow, CNTK, PyTorch, ONNX and CoreML.

TensorRT - NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.

pytorch2keras - PyTorch to Keras model converter

jetson-inference - Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.

keras2cpp - Code to port Keras neural network models into pure C++.

server - The Triton Inference Server provides an optimized cloud and edge inferencing solution.

cppflow - Run TensorFlow models in C++ without installation and without Bazel

deepC - vendor-independent TinyML deep learning library, compiler and inference framework for microcomputers and microcontrollers

modeld - Self-driving car lane and path detection