LLamaSharp Alternatives
Similar projects and alternatives to LLamaSharp
-
LocalAI
:robot: The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many other model architectures, and can generate text, audio, video, and images, with voice-cloning capabilities.
-
FastChat
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
-
easydiffusion
The easiest 1-click way to create beautiful artwork on your PC using AI, with no technical knowledge required. Provides a browser UI for generating images from text prompts and images. Just enter your text prompt and see the generated image.
-
SaaSHub
SaaSHub - Software Alternatives and Reviews. SaaSHub helps you find the best software and product alternatives
-
llama-node
Believe in AI democratization. llama for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; works locally on your laptop CPU. Supports llama/alpaca/gpt4all/vicuna/rwkv models.
LLamaSharp reviews and mentions
-
This is getting really complicated.
For example, I have my own task and needed another tool, so I searched and found what I needed: https://github.com/SciSharp/LLamaSharp, which lets me take the next step with https://github.com/Xsanf/LLaMa_Unity . I can already run an LLM in Unity, which opens up the possibility of using it in games natively.
-
cannot for the life of me compile libllama.dll
I searched through GitHub and nothing new comes up. I wanted to run the model through the C# wrapper linked on LLaMASharp, which requires compiling llama.cpp and copying the libllama DLL into the C# project files. When I build llama.cpp with OpenBLAS, everything shows up fine on the command line, and, as the link suggests, I make sure to set -DBUILD_SHARED_LIBS=ON in CMake. However, the Visual Studio Developer Command Prompt output ignores the libllama.dll setup in CMakeLists.txt entirely; the only DLL that compiles is llama.dll. I know this is a fairly technical question, but does anyone know how to fix this?
-
Could I get a suggestion for a simple HTTP API with no GUI for llama.cpp?
C#/.NET: SciSharp/LLamaSharp
-
Stats
SciSharp/LLamaSharp is an open-source project licensed under the MIT License, an OSI-approved license.
The primary programming language of LLamaSharp is C#.