Simplest way to deploy Keras NN model into C++?

This page summarizes the projects mentioned and recommended in the original post on

  • ssd_keras

    A Keras port of Single Shot MultiBox Detector

    Don't know about simplest, but we used either Caffe or TensorRT. TensorRT is maybe a bit difficult to use, but I'd actually say simple, fast GPU inference is exactly what it's geared towards. There is a Keras -> Caffe converter, I think. Caffe is a typical C++ library, with dependencies and all. I've never heard of TensorFlow itself running in C++. But with TensorRT you get an "artifact" that you load at runtime, no matter where it came from.
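
    The "artifact" loading mentioned above can be sketched with the TensorRT C++ API. This is a hypothetical sketch, not the poster's actual code: it assumes TensorRT is installed and that a serialized engine file named `model.engine` was produced beforehand (e.g. with `trtexec`); both the filename and the logger setup are placeholders.

    ```cpp
    #include <NvInfer.h>
    #include <fstream>
    #include <iostream>
    #include <vector>

    // Minimal logger required by the TensorRT runtime.
    class Logger : public nvinfer1::ILogger {
        void log(Severity severity, const char* msg) noexcept override {
            if (severity <= Severity::kWARNING)
                std::cerr << msg << std::endl;
        }
    };

    int main() {
        Logger logger;

        // Read the serialized engine ("artifact") from disk.
        std::ifstream file("model.engine", std::ios::binary);
        std::vector<char> data((std::istreambuf_iterator<char>(file)),
                               std::istreambuf_iterator<char>());

        // Deserialize it into a runnable engine.
        auto* runtime = nvinfer1::createInferRuntime(logger);
        auto* engine  = runtime->deserializeCudaEngine(data.data(), data.size());
        if (!engine) {
            std::cerr << "failed to load engine" << std::endl;
            return 1;
        }
        // From here an IExecutionContext would be created to run inference:
        // auto* context = engine->createExecutionContext();
        return 0;
    }
    ```

    The point is that the engine file is self-contained: it no longer matters whether the network originally came from Keras, Caffe, or ONNX.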

  • cppflow

    Run TensorFlow models in C++ without installation and without Bazel

    If you're using Keras with TensorFlow, you can save the model in the SavedModel format and then easily use cppflow to perform inference with it.
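
    A minimal sketch of that workflow, assuming the Keras model was exported in Python with `model.save("saved_model_dir")` and that the input is a 224x224 RGB image batch; the path and shape are placeholders, not part of the original post.

    ```cpp
    #include <cppflow/cppflow.h>
    #include <iostream>

    int main() {
        // Load the exported SavedModel directory.
        cppflow::model model("saved_model_dir");

        // Build a dummy input tensor filled with a constant value.
        auto input = cppflow::fill({1, 224, 224, 3}, 0.5f);

        // Run inference; for a single-input/single-output model the
        // model object is directly callable.
        auto output = model(input);
        std::cout << output << std::endl;
        return 0;
    }
    ```

    cppflow links against the TensorFlow C API, so no Bazel build of TensorFlow itself is needed, which is what makes it one of the simpler deployment routes.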


NOTE: The number of mentions on this list indicates mentions on common posts plus user suggested alternatives. Hence, a higher number means a more popular project.
