gpu_poor Alternatives
Similar projects and alternatives to gpu_poor
swiss_army_llama
A FastAPI service for semantic text search using precomputed embeddings and advanced similarity measures, with built-in support for various file types through textract.
chitchat
A simple LLM chat front-end that makes it easy to find, download, and mess around with models on your local machine. (by clarkmcc)
Pacha
"Pacha" TUI (Text User Interface) is a JavaScript application that utilizes the "blessed" library. It serves as a frontend for llama.cpp and provides a convenient and straightforward way to perform inference using local language models.
code-llama-for-vscode
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
gpu_poor discussion
gpu_poor reviews and mentions
Ask HN: Cheapest way to run local LLMs?
Here's a simple calculator for LLM inference requirements: https://rahulschand.github.io/gpu_poor/
- How many tokens/s can I get? A simple GitHub tool to see the tokens/s you can get for a given LLM
- Show HN: Can your LLM run this?
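gpu_poor estimates whether a given LLM fits on your hardware and what throughput to expect. As a rough illustration of the kind of back-of-the-envelope VRAM estimate such a calculator performs, here is a minimal JavaScript sketch; the function name, parameter names, and overhead factor are illustrative assumptions, not gpu_poor's actual formula.

// Rough sketch: estimate VRAM needed to run an LLM, in the spirit of gpu_poor.
// All names and constants below are illustrative assumptions, not gpu_poor's code.
function estimateVramGB({ numParamsB, bytesPerParam, contextLen, numLayers, hiddenSize }) {
  // Model weights: parameters * bytes per parameter (e.g. 2 for fp16, ~0.5 for 4-bit).
  const weightsGB = (numParamsB * 1e9 * bytesPerParam) / 1e9;
  // KV cache: 2 (K and V) * layers * context length * hidden size * 2 bytes (fp16).
  const kvCacheGB = (2 * numLayers * contextLen * hiddenSize * 2) / 1e9;
  // Activations and framework overhead, very roughly ~10% on top (assumed).
  const overheadGB = 0.1 * (weightsGB + kvCacheGB);
  return weightsGB + kvCacheGB + overheadGB;
}

// Example: a 7B model in fp16 with a 4096-token context (LLaMA-7B-like shape) -> ~17.8 GB.
console.log(
  estimateVramGB({ numParamsB: 7, bytesPerParam: 2, contextLen: 4096, numLayers: 32, hiddenSize: 4096 }).toFixed(1),
  "GB"
);

Quantizing to 4-bit (bytesPerParam around 0.5) or shortening the context reduces the estimate accordingly, which is the trade-off a tool like this lets you explore.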
Stats
The primary programming language of gpu_poor is JavaScript.