Terminal-llm Alternatives
Similar projects and alternatives to terminal-llm
-
plusplus-camall
I don't know what this is or will evolve into exactly, but consider it a playground for experimenting, tinkering, and hacking around the ggml library, especially llama.cpp, and primarily its server as a first-class citizen. The best fruits will hopefully be merged upstream, as long as they are consistent with the philosophy of llama.cpp.
-
llama-server-chat-terminal-client
Lightweight terminal chat interface for the llama.cpp server, compilable for Windows and Linux.
terminal-llm reviews and mentions
-
Terminal-LLM: Now with Transformers support for non-quantized models.
I've added support for native (non-quantized) models using the Transformers loader. You can check out terminal-llm at https://github.com/raddka/terminal-llm.
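A minimal sketch of what such Transformers-based loading typically looks like, assuming Hugging Face's `AutoTokenizer`/`AutoModelForCausalLM` API; this is illustrative only, not the project's actual code, and the history-rendering helper is a hypothetical name.

```python
def render_history(turns):
    # Hypothetical helper: flatten (role, text) chat turns into a plain
    # prompt string for a terminal chat loop.
    return "\n".join(f"{role}: {text}" for role, text in turns) + "\nassistant:"

def load_model(name: str):
    # Imported lazily so the rest of the module works without transformers
    # installed. The model name passed in is up to the user.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name)
    return tokenizer, model

def generate(tokenizer, model, prompt: str, max_new_tokens: int = 64) -> str:
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```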
-
Terminal client chat for llama.cpp server.
I am working on something similar that is Python based at https://github.com/raddka/terminal-llm.
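A Python terminal client along these lines would talk to llama.cpp's built-in HTTP server. A minimal sketch, assuming the server is running locally on its default port 8080 and using its `/completion` endpoint (field names follow the llama.cpp server documentation; this is not the project's actual code):

```python
import json
import urllib.request

def build_payload(prompt: str, n_predict: int = 128) -> dict:
    # /completion accepts "prompt" and "n_predict" (max tokens to generate),
    # among other sampling parameters.
    return {"prompt": prompt, "n_predict": n_predict}

def complete(prompt: str, url: str = "http://127.0.0.1:8080/completion") -> str:
    # POST the JSON payload and return the generated text from "content".
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]
```

Using only the standard library keeps such a client lightweight, which fits the terminal-client use case.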
-
Terminal-LLM: Lightweight, simple, Python-based LLM inference in the terminal.
Check it out at https://github.com/raddka/terminal-llm
-
Stats
raddka/terminal-llm is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of terminal-llm is Python.