LLamaSharp
| | LLamaSharp | LLaMa_Unity |
|---|---|---|
| Mentions | 3 | 1 |
| Stars | 2,015 | 5 |
| Growth | 14.0% | - |
| Activity | 9.8 | 5.6 |
| Last Commit | 3 days ago | 11 months ago |
| Language | C# | - |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
LLamaSharp
-
This is getting really complicated.
For example, I had my own task and needed another tool, so I searched and found what I needed: https://github.com/SciSharp/LLamaSharp , which then let me take the next step with https://github.com/Xsanf/LLaMa_Unity . I can already run an LLM in Unity, which opens up the possibility of using it in games natively.
-
cannot for the life of me compile libllama.dll
I searched through GitHub and nothing new comes up. I wanted to run the model through the C# wrapper linked on LLaMASharp, which requires compiling llama.cpp and copying the libllama DLL into the C# project files. When I build llama.cpp with OpenBLAS, everything looks fine on the command line. As the link suggests, I make sure to set -DBUILD_SHARED_LIBS=ON in CMake. However, the build in the Visual Studio Developer Command Prompt seems to ignore the libllama.dll setup in CMakeLists.txt entirely; the only DLL produced is llama.dll. I know this is a fairly technical question, but does anyone know how to fix it?
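A minimal sketch of the build in question, assuming a recent llama.cpp checkout with CMake and the MSVC toolchain on the PATH (the BLAS option names are assumptions — llama.cpp has renamed its CMake options across versions). The likely explanation for the "missing" DLL: on Windows, MSVC does not prepend the `lib` prefix that Unix toolchains use, so the shared library comes out as `llama.dll`, not `libllama.dll` — the same library under its platform-specific name.

```shell
# Sketch: building llama.cpp as a shared library with OpenBLAS on Windows.
# Flag names (LLAMA_BLAS, LLAMA_BLAS_VENDOR) are assumptions and have
# changed between llama.cpp versions -- check the repo's build docs.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DBUILD_SHARED_LIBS=ON -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
cmake --build build --config Release

# MSVC omits the "lib" prefix, so the output appears as
# build\bin\Release\llama.dll rather than libllama.dll --
# that file is the one to copy into the C# project.
```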
-
Could I get a suggestion for a simple HTTP API with no GUI for llama.cpp?
C#/.NET: SciSharp/LLamaSharp
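For context on what consuming LLamaSharp from C# looks like, here is a hedged sketch. The type names (`ModelParams`, `LLamaWeights`, `InteractiveExecutor`, `InferenceParams`) reflect one LLamaSharp release line and may differ in other versions; the model path is hypothetical.

```csharp
using LLama;
using LLama.Common;

// Hypothetical model path; use any model supported by the
// underlying llama.cpp build that LLamaSharp wraps.
var parameters = new ModelParams("models/model.gguf") { ContextSize = 1024 };

using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

// Stream tokens back as they are generated.
await foreach (var token in executor.InferAsync(
    "Q: Name a C# binding for llama.cpp. A:",
    new InferenceParams { MaxTokens = 32 }))
{
    Console.Write(token);
}
```

Wrapping this in a minimal ASP.NET Core endpoint would give the GUI-free HTTP API the question asks about.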
LLaMa_Unity
What are some alternatives?
SillyTavern - LLM Frontend for Power Users.
tortoise-tts - A multi-voice TTS system trained with an emphasis on quality
llama.cpp-dotnet - Minimal C# bindings for llama.cpp + .NET core library with API host/client.
llama.net - .NET wrapper for LLaMA.cpp for LLaMA language model inference on CPU. 🦙
SciSharp-Stack-Examples - Practical examples written in SciSharp's machine learning libraries
LLamaStack - ASP.NET Core Web, WebApi & WPF implementations for LLama.cpp & LLamaSharp
LocalAI - 🤖 The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI running on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many more model architectures, and can generate text, audio, video, and images, with voice-cloning capabilities.
llama-node - Believe in AI democratization. llama for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; works locally on your laptop CPU and supports llama/alpaca/gpt4all/vicuna/rwkv models.
go-llama.cpp - LLama.cpp golang bindings
agency - A fast and minimal framework for building agent-integrated systems
ask-chat-gpt - [Archived - bot was banned by Reddit during API changes] Experimental reddit bot that builds replies using OpenAI's API
SlackAI - Slack LLM app integration