gpu_poor vs llama.net
gpu_poor
- Ask HN: Cheapest way to run local LLMs?
  Here's a simple calculator for LLM inference requirements: https://rahulschand.github.io/gpu_poor/
- How many tokens/s can I get? A simple GitHub tool to see the tokens/s you can get for an LLM
- Show HN: Can your LLM run this?
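gpu_poor's exact formula isn't reproduced here, but the kind of estimate such a calculator performs is straightforward: model weights plus KV cache plus some runtime overhead. A minimal sketch, with all architecture defaults (layer count, hidden size, fp16 cache) assumed purely for illustration:

```python
def estimate_inference_memory_gb(n_params_b, bits=16, n_layers=32,
                                 hidden_size=4096, context_len=2048,
                                 overhead_gb=1.0):
    """Back-of-envelope GPU memory estimate for LLM inference.

    n_params_b: model size in billions of parameters.
    bits: bits per weight (16 for fp16, 4 for 4-bit quantization).
    Defaults roughly match a 7B Llama-style model; real calculators
    read these from the model config.
    """
    # Weights: one value per parameter at the chosen precision.
    weights_gb = n_params_b * 1e9 * bits / 8 / 1024**3
    # KV cache: 2 tensors (K and V) per layer, one hidden-size vector
    # per token, stored in fp16 (2 bytes) regardless of weight bits.
    kv_cache_gb = 2 * n_layers * hidden_size * context_len * 2 / 1024**3
    return weights_gb + kv_cache_gb + overhead_gb

# A 7B model at 2k context: fp16 vs 4-bit quantized.
print(round(estimate_inference_memory_gb(7, bits=16), 1))  # ≈ 15.0
print(round(estimate_inference_memory_gb(7, bits=4), 1))   # ≈ 5.3
```

The gap between the two numbers is why quantization is the usual answer to "cheapest way to run local LLMs": weights dominate the total, and 4-bit cuts them by 4x versus fp16.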
llama.net
- Is there an API for Hugging Face LLMs?
  Build one on .NET with llama.cpp and https://github.com/hpretila/llama.net
- API-supported local LLM
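The comment above suggests building a .NET API around llama.cpp; another common route, which llama.net wraps, is llama.cpp's own bundled HTTP server (started with something like `./server -m model.gguf`), which exposes a `/completion` endpoint. A hedged sketch of building such a request with only the standard library (the host, port, and field names assume a recent llama.cpp build):

```python
import json
from urllib import request

def completion_request(prompt, n_predict=128, host="http://localhost:8080"):
    """Build a request for llama.cpp's HTTP server /completion endpoint.

    Returns an unsent urllib Request so the payload can be inspected;
    pass it to urllib.request.urlopen() against a running server.
    """
    body = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode()
    return request.Request(f"{host}/completion", data=body,
                           headers={"Content-Type": "application/json"})

req = completion_request("What is a KV cache?")
print(req.full_url)  # http://localhost:8080/completion
```

Any HTTP-capable client, .NET included, can hit the same endpoint, which is what makes a local llama.cpp server a drop-in "API for Hugging Face LLMs" once the model is converted to GGUF.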
What are some alternatives?
LLamaStack - ASP.NET Core Web, WebApi & WPF implementations for LLama.cpp & LLamaSharp
LLamaSharp - A C#/.NET library to run LLM models (🦙LLaMA/LLaVA) on your local device efficiently.
chatd - Chat with your documents using local AI
uCat - Hi!👋😸I am uCat, your brain implant assistant. I can help you speak and move again inside the metaverse.
chitchat - A simple LLM chat front-end that makes it easy to find, download, and mess around with models on your local machine.
SlackAI - Slack LLM app integration
Pacha - "Pacha" is a TUI (text user interface) frontend for llama.cpp, written in JavaScript with the "blessed" library, that provides a straightforward way to run inference with local language models.
code-llama-for-vscode - Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
PerroPastor - Run Llama based LLMs in Unity entirely in compute shaders with no dependencies
Introducing .NET Multi-platform App UI (MAUI) - .NET MAUI is the .NET Multi-platform App UI, a framework for building native device applications spanning mobile, tablet, and desktop.