llama-server-chat-terminal-client
Lightweight terminal chat interface for the llama.cpp server, compilable for Windows and Linux. (by hwpoison)
terminal-llm
Simple terminal-based LLM interface. (by raddka)
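Both projects are thin clients over a local LLM backend. As a rough illustration of what such a client does, here is a minimal sketch of building a request for the llama.cpp server's OpenAI-compatible `/v1/chat/completions` endpoint; the helper name and defaults are illustrative assumptions, not code from either repository.

```python
import json

def build_chat_request(history, user_msg, max_tokens=256):
    """Return the JSON body for a non-streaming chat completion request.
    Hypothetical helper, not taken from either project's source."""
    messages = list(history) + [{"role": "user", "content": user_msg}]
    return {"messages": messages, "max_tokens": max_tokens, "stream": False}

payload = build_chat_request([], "Hello!")
print(json.dumps(payload))

# Sending it (assuming a llama.cpp server on localhost:8080) would look like:
#   req = urllib.request.Request(
#       "http://localhost:8080/v1/chat/completions",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"})
#   resp = urllib.request.urlopen(req)
```

A real client would loop over user input, append each assistant reply to `history`, and re-send the growing message list so the server sees the full conversation.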
| | llama-server-chat-terminal-client | terminal-llm |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 10 | 9 |
| Growth | - | - |
| Activity | 6.8 | 8.3 |
| Latest commit | 3 months ago | 5 months ago |
| Language | C++ | Python |
| License | - | MIT License |
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
llama-server-chat-terminal-client
Posts with mentions or reviews of llama-server-chat-terminal-client. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-05.
terminal-llm
Posts with mentions or reviews of terminal-llm. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-05.
- Terminal-LLM: Now with Transformers support for non-quant models.
  I've added support for native models by using the transformers loader. You can check terminal-llm at https://github.com/raddka/terminal-llm.
- Terminal client chat for llama.cpp server.
  I am working on something similar that is Python-based at https://github.com/raddka/terminal-llm.
- Terminal-LLM: Lightweight, simple, Python-based LLM inference in the terminal.
  Check it out at https://github.com/raddka/terminal-llm