lm-inference-engines Alternatives
Similar projects and alternatives to lm-inference-engines
-
text-generation-webui
A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
-
TensorRT-LLM
TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that execute those TensorRT engines.
-
pdfGPT
PDF GPT allows you to chat with the contents of your PDF file using GPT capabilities. An open-source solution for turning your PDF files into a chatbot.
lm-inference-engines reviews and mentions
- Nvidia's Chat with RTX is a promising AI chatbot that runs locally on your PC
- Open Inference Engines – Feature Comparison of Language Model Inference Engines
- Open Inference Engine Comparison | Features and Functionality of TGI, vLLM, llama.cpp, and TensorRT-LLM
Stats
lapp0/lm-inference-engines is an open source project licensed under the GNU General Public License v3.0 only, which is an OSI-approved license.