Similar projects and alternatives to serving
The Triton Inference Server provides an optimized cloud and edge inferencing solution. (by triton-inference-server)
Serve, optimize and scale PyTorch models in production (by pytorch)
A C++ standalone library for machine learning (by flashlight)
The Julia Programming Language
Julia on TPUs (by JuliaTPU)
OneFlow is a deep learning framework designed to be user-friendly, scalable and efficient.
A model deployment library in Python, aiming to be the simplest possible model inference server.
MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba
An Open Source Machine Learning Framework for Everyone
Compiler for Neural Network hardware accelerators (by pytorch)
Build and publish crates with pyo3, rust-cpython and cffi bindings as well as rust binaries as python packages
A performant and modular runtime for TensorFlow (by tensorflow)
eksctl and k8s utilities
serving reviews and mentions
Would you use maturin for ML model serving?
2 projects | reddit.com/r/rust | 8 Jul 2022
Which ML framework do you use? TensorFlow has https://github.com/tensorflow/serving. You could also use the Rust bindings to load a SavedModel and expose it through one of the Rust HTTP servers. It doesn't matter that you trained your model in Python, as long as you export it as a SavedModel.
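The second option the comment describes, loading a trained model once at startup and exposing it behind an HTTP server, can be sketched roughly as follows. This is a minimal illustration in Python using only the standard library; the comment proposes doing the same thing in Rust, and a real server would load an exported SavedModel through the framework's bindings rather than the stub weight vector used here.

```python
# Minimal sketch: expose an already-loaded "model" over HTTP.
# Assumption: WEIGHTS stands in for a real model loaded from a SavedModel
# export, so the example stays self-contained and runnable.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

WEIGHTS = [0.5, -1.25, 2.0]  # stand-in for a loaded model

def predict(features):
    """Dummy inference: dot product of the input features and the weights."""
    return sum(w * x for w, x in zip(WEIGHTS, features))

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect a JSON body like {"features": [2, 0, 1]}.
        body = self.rfile.read(int(self.headers["Content-Length"]))
        features = json.loads(body)["features"]
        payload = json.dumps({"prediction": predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # Serve predictions on the conventional TF Serving REST port.
    HTTPServer(("127.0.0.1", 8501), PredictHandler).serve_forever()
```

Because the model is loaded once and only inference runs per request, the training language is irrelevant at serving time, which is the comment's point.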
Popular Machine Learning Deployment Tools
4 projects | dev.to | 16 Apr 2022
If data science uses a lot of computational power, then why is Python the most used programming language?
6 projects | reddit.com/r/learnmachinelearning | 13 Apr 2022
You serve models via https://www.tensorflow.org/tfx/guide/serving, which is written entirely in C++ (https://github.com/tensorflow/serving/tree/master/tensorflow_serving/model_servers); there is no Python on the serving path or in the shipped product.
Exposing Tensorflow Serving’s gRPC Endpoints on Amazon EKS
2 projects | dev.to | 10 Feb 2021
gRPC only connects to a host and port, but we can use whatever service route we want. Above, I use the path we configured in our k8s Ingress object, /service1, and override the base configuration provided by TensorFlow Serving. When we call the tfserving_metadata function above, we pass /service1 as an argument.
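One way to picture how the ingress route composes with a gRPC call: on the wire, gRPC is an HTTP/2 POST whose :path pseudo-header is /<package.Service>/<Method>, and the ingress matches or rewrites a prefix in front of that. The helper below is hypothetical (not part of tensorflow-serving or grpc), and the /service1 prefix is the one configured in the article's Ingress object.

```python
# Hypothetical helper: show the full HTTP/2 ":path" an ingress would see
# when a route prefix is placed in front of a gRPC method path.
# /service1, PredictionService, and GetModelMetadata follow the article's
# example; the function itself is illustrative, not a real library API.
def grpc_method_path(route_prefix, service, method):
    """Compose an ingress route prefix with a gRPC service/method path."""
    return f"{route_prefix.rstrip('/')}/{service}/{method}"

print(grpc_method_path("/service1",
                       "tensorflow.serving.PredictionService",
                       "GetModelMetadata"))
# /service1/tensorflow.serving.PredictionService/GetModelMetadata
```

This is why the host-and-port limitation isn't a problem in practice: the routing decision happens on the path prefix at the ingress, not inside the gRPC channel.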
tensorflow/serving is an open source project licensed under the Apache License 2.0, an OSI-approved license.