Python ggml Projects
- inference — Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you can run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
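A minimal sketch of what that "single line" swap looks like in practice: an OpenAI-style client is redirected to a local Xinference endpoint purely by changing its base URL. The local URL, port, and the helper function here are illustrative assumptions, not taken from the Xinference docs.

```python
# Assumed endpoints for illustration only.
OPENAI_BASE_URL = "https://api.openai.com/v1"
XINFERENCE_BASE_URL = "http://localhost:9997/v1"  # hypothetical local Xinference endpoint


def make_client_config(base_url: str, api_key: str = "not-needed-locally") -> dict:
    """Return the connection settings an OpenAI-compatible client would use."""
    return {"base_url": base_url, "api_key": api_key}


# Before: the client talks to OpenAI.
config = make_client_config(OPENAI_BASE_URL)

# After: the single changed line points the same client at Xinference.
config = make_client_config(XINFERENCE_BASE_URL)
```

Because the rest of the application only sees an OpenAI-compatible interface, no other code has to change when the backend model is swapped.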
Project mention: GreptimeAI + Xinference - Efficient Deployment and Monitoring of Your LLM Applications | dev.to | 2024-01-24

Xorbits Inference (Xinference) is an open-source platform that streamlines the operation and integration of a wide array of AI models. With Xinference, you can run inference using any open-source LLMs, embedding models, and multimodal models, either in the cloud or on your own premises, and build robust AI-driven applications. It provides a RESTful API compatible with the OpenAI API, a Python SDK, a CLI, and a WebUI. It also integrates third-party developer tools such as LangChain, LlamaIndex, and Dify, facilitating model integration and development.
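Since the RESTful API is described as OpenAI-compatible, a chat-completion call would look like a standard OpenAI-style POST. The host, port, endpoint path, and model name below are assumptions for illustration; the request is only constructed here, not sent.

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat-completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",  # assumed OpenAI-compatible path
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Hypothetical local server and model name.
req = build_chat_request("http://localhost:9997", "my-local-model", "Hello!")
# To actually send it against a running server: urllib.request.urlopen(req)
```

Any HTTP client works here; the point is that existing OpenAI-API tooling (including the SDKs mentioned above) can target the same endpoint shape.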
Project mention: New open-source model with 8k context runs on CPU, outperforms GPT-3 | news.ycombinator.com | 2023-06-30
Python ggml related posts
- Ask HN: Cheapest way to run local LLMs?
- Minigpt4 Inference on CPU
- New open-source model with 8k context runs on CPU, outperforms GPT-3
- MPT 30B inference code using CPU
- The Coming of Local LLMs
Index
| # | Project | Stars |
|---|---|---|
| 1 | inference | 2,629 |
| 2 | mpt-30B-inference | 572 |