The Triton Inference Server provides an optimized cloud and edge inferencing solution. (by triton-inference-server)
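Triton exposes models over HTTP/gRPC using the KServe v2 inference protocol (`POST /v2/models/<model>/infer`). As a minimal sketch of the JSON body such a request carries — the input name `INPUT0`, the datatype, and the shape below are hypothetical placeholders, not details taken from this page:

```python
import json

def build_infer_request(input_name, data, datatype="FP32"):
    """Build a KServe-v2-style JSON body for a single-input inference call.

    The tensor is sent as a flat list with an explicit shape; a real model
    would dictate the actual names, shapes, and datatypes.
    """
    return {
        "inputs": [
            {
                "name": input_name,          # hypothetical input tensor name
                "shape": [1, len(data)],     # batch of 1, flat feature vector
                "datatype": datatype,        # e.g. FP32, INT64, BYTES
                "data": data,                # row-major payload
            }
        ]
    }

body = build_infer_request("INPUT0", [1.0, 2.0, 3.0, 4.0])
print(json.dumps(body))
```

In practice this body would be POSTed to a running Triton endpoint (or built via the `tritonclient` library); the sketch only illustrates the wire format.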

Server Alternatives

Similar projects and alternatives to server

  • GitHub repo onnx-tensorrt

    ONNX-TensorRT: TensorRT backend for ONNX

  • GitHub repo DeepSpeed

    DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.

  • GitHub repo ROCm

    ROCm - Open Source Platform for HPC and Ultrascale GPU Computing

  • GitHub repo Triton

Triton is a Dynamic Binary Analysis (DBA) framework. It provides internal components such as a Dynamic Symbolic Execution (DSE) engine, a dynamic taint engine, AST representations of the x86, x86-64, ARM32 and AArch64 instruction set architectures (ISAs), SMT simplification passes, an SMT solver interface and, last but not least, Python bindings. (by JonathanSalwan)

  • GitHub repo tensorflow

    An Open Source Machine Learning Framework for Everyone

  • GitHub repo TensorRT

TensorRT is a C++ library for high-performance inference on NVIDIA GPUs and deep learning accelerators.

  • GitHub repo Megatron-LM

    Ongoing research training transformer language models at scale, including: BERT & GPT-2

  • GitHub repo tensorflow-upstream

    TensorFlow ROCm port

  • GitHub repo keras-onnx

    Convert tf.keras/Keras models to ONNX

  • GitHub repo storium-backend

    Source code for the web backend for hosting story generation models in the EMNLP 2020 paper "STORIUM: A Dataset and Evaluation Platform for Human-in-the-Loop Story Generation"

  • GitHub repo modeld

    Self driving car lane and path detection

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives; a higher count therefore indicates a more frequently cited server alternative or a higher degree of similarity.

Reviews and mentions

Posts with mentions or reviews of server. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-11-17.


Basic server repo stats
triton-inference-server/server is an open source project licensed under the BSD 3-Clause License, which is an OSI approved license.
