Top 11 llama-cpp Open-Source Projects
-
maid
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely. (by Mobile-Artificial-Intelligence)
-
llama-server-playground
A little single-file frontend for llama.cpp/examples/server, built with Vue, Tailwind CSS, and Flask.
You can reuse a simple LLaMA tokenizer directly in your Go code; see:
https://github.com/gotzmann/llama.go/blob/8cc54ca81e6bfbce25...
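To illustrate the idea of a standalone tokenizer in Go, here is a minimal sketch of greedy longest-match tokenization against a toy vocabulary. The vocabulary and function are hypothetical, for illustration only; they are not llama.go's actual API, and a real LLaMA tokenizer uses a SentencePiece BPE vocabulary loaded from the model file.

```go
package main

import "fmt"

// tokenize greedily matches the longest vocabulary entry at the start of
// the remaining text, emitting one token ID per match. (Toy scheme; a real
// BPE tokenizer merges byte pairs by learned rank instead.)
func tokenize(vocab map[string]int, text string) []int {
	var ids []int
	for len(text) > 0 {
		end := len(text)
		for end > 0 {
			if id, ok := vocab[text[:end]]; ok {
				ids = append(ids, id)
				text = text[end:]
				break
			}
			end--
		}
		if end == 0 {
			// Unknown byte: skip it (a real tokenizer falls back to byte tokens).
			text = text[1:]
		}
	}
	return ids
}

func main() {
	vocab := map[string]int{"he": 1, "hello": 2, " world": 3, "wor": 4}
	fmt.Println(tokenize(vocab, "hello world")) // "hello" then " world" → [2 3]
}
```

The appeal, as the comment suggests, is that a pure-Go tokenizer lets you count or pre-process tokens without cgo or a llama.cpp binding in the loop.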
Project mention: Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone | news.ycombinator.com | 2024-04-23
I've been trying this app, but haven't had any luck getting it to actually generate text yet:
https://github.com/Mobile-Artificial-Intelligence/maid
The UI looks nice and includes a native compilation of llama.cpp.
My main phone's screen broke, so I'm on an old Pixel 4 until it's repaired, but I've had no luck getting 2-3 GB models to run so far.
Project mention: Are you sure you are focusing on the right things? (venting) | /r/LocalLLaMA | 2023-07-11
The easiest tool I found is CatAI: https://github.com/ido-pluto/catai. You just type three npm commands and that's it! You have your own chat web UI on your computer, without hundreds of settings.
Project mention: Show HN: Collider – the platform for local LLM debug and inference at warp speed | news.ycombinator.com | 2023-11-30
Hi, I just want to share my open-source project, created for solo roleplaying with an LLM (like ChatGPT, but offline) plus Stable Diffusion for generating images. Try my repo and let me know if anything is wrong, or if you have ideas or suggestions :) https://github.com/rbourgeat/ImpAI
Project mention: Go, Python, Rust, and production AI applications | news.ycombinator.com | 2024-03-12
I switched from Python to Rust for my AI stuff. Honestly, I don't care about the things people say Rust is used for. I like it because the package manager, testing, and typing are built into the ecosystem by default, which makes it so easy to build with. In Python all of that can be done too, but you then have to maintain all of those separate tools. The overhead of writing Rust is less than the overhead of dealing with the Python ecosystem. And then you have all the benefits of Rust everyone mentions more often... one other thing no one mentions is the feedback loop between a strongly typed language and Copilot's ability to more accurately generate code.
That being said, there is a real shortage of Rust software for Rust-only projects. I ended up writing a wrapper for llama.cpp and the OpenAI API [0] because I needed it and couldn't find anything out there. Eventually, I do intend to adopt Hugging Face's Candle library [1] (a Rust version of Torch). There is something appealing about doing everything in a single language, especially as the monopoly of CUDA inevitably gets chipped away.
[0] https://github.com/ShelbyJenkins/llm_client
Link to the playground frontend: https://github.com/hwpoison/llama-server-playground/tree/main/frontend (the Windows release is available there as well).
Index
What are some of the best open-source llama-cpp projects? This list will help you:
# | Project | Stars
---|---|---
1 | llama.go | 1,165 |
2 | maid | 691 |
3 | catai | 407 |
4 | xef | 162 |
5 | llama.clj | 123 |
6 | collider | 117 |
7 | shady.ai | 106 |
8 | ImpAI | 38 |
9 | llama.py | 28 |
10 | llm_client | 18 |
11 | llama-server-playground | 2 |