llama.net VS gpu_poor

Compare llama.net vs gpu_poor and see how they differ.

llama.net

.NET wrapper for LLaMA.cpp for LLaMA language model inference on CPU. 🦙 (by hpretila)

gpu_poor

Calculate token/s & GPU memory requirement for any LLM. Supports llama.cpp/ggml/bnb/QLoRA quantization (by RahulSChand)
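gpu_poor's headline numbers come down to arithmetic over model size and hardware specs. As a rough illustration only (not the tool's actual formulas), here is a minimal sketch assuming the weights dominate memory use and single-stream decoding is memory-bandwidth-bound:

```python
# Back-of-the-envelope estimates in the spirit of gpu_poor.
# Illustrative assumptions, not the tool's actual code: ignores KV cache,
# activations, and quantization overhead.

def model_memory_gb(params_billion: float, bits_per_weight: int = 16) -> float:
    """Memory for the weights alone: parameters * bytes per parameter."""
    return params_billion * 1e9 * (bits_per_weight / 8) / 1e9

def decode_tokens_per_sec(params_billion: float, bits_per_weight: int,
                          mem_bandwidth_gb_s: float) -> float:
    """Upper bound for single-stream decoding: each generated token must
    read every weight once, so throughput ~ bandwidth / model size."""
    return mem_bandwidth_gb_s / model_memory_gb(params_billion, bits_per_weight)

# Example: a 7B model quantized to 4 bits on a GPU with ~1000 GB/s bandwidth.
print(f"{model_memory_gb(7, 4):.1f} GB for weights")       # ~3.5 GB
print(f"{decode_tokens_per_sec(7, 4, 1000):.0f} tokens/s")  # ~286 tokens/s ceiling
```

gpu_poor additionally accounts for things like KV cache and quantization format (llama.cpp/ggml/bnb/QLoRA), so treat the numbers above as a rough ceiling rather than a prediction.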
              llama.net          gpu_poor
Mentions      2                  3
Stars         53                 650
Growth        -                  -
Activity      4.0                8.3
Last Commit   about 1 year ago   7 months ago
Language      C#                 JavaScript
License       MIT License        -
Mentions - the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

llama.net

Posts with mentions or reviews of llama.net. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-02.

gpu_poor

Posts with mentions or reviews of gpu_poor. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-11-26.

What are some alternatives?

When comparing llama.net and gpu_poor you can also consider the following projects:

LLamaSharp - A C#/.NET library to run LLM models (🦙LLaMA/LLaVA) on your local device efficiently.

LLamaStack - ASP.NET Core Web, WebApi & WPF implementations for LLama.cpp & LLamaSharp

uCat - Hi!👋😸 I am uCat, your brain implant assistant. I can help you speak and move again, inside the metaverse.

chatd - Chat with your documents using local AI

SlackAI - Slack LLM app integration

chitchat - A simple LLM chat front-end that makes it easy to find, download, and mess around with models on your local machine.

Pacha - "Pacha" TUI (Text User Interface) is a JavaScript application that utilizes the "blessed" library. It serves as a frontend for llama.cpp and provides a convenient and straightforward way to perform inference using local language models.

PerroPastor - Run Llama based LLMs in Unity entirely in compute shaders with no dependencies

code-llama-for-vscode - Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.

Introducing .NET Multi-platform App UI (MAUI) - .NET MAUI is a framework for building native device applications spanning mobile, tablet, and desktop.