chaz vs chatty-llama

| | chaz | chatty-llama |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 23 | 28 |
| Growth | - | - |
| Activity | 8.4 | 7.2 |
| Latest Commit | 12 days ago | 9 months ago |
| Language | Rust | Rust |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
chaz
Claude AI launches on iOS (Android coming soon)
At the risk of being spammy: I wrote a simple Matrix bot that replicates their entire currently announced feature set, also supports any other model, and can be used from a normal Matrix client.
Stop re-building chat clients, I already have one!
Ideally they would just run their own chatbot on existing chat platforms, letting you verify with your API key; with my project you can at least run that chatbot yourself.
[0] - https://github.com/arcuru/chaz
[1] - https://jackson.dev/post/chaz/ (Blog Post)
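The core idea above, a bot that relays chat messages to any model backend, can be sketched in a few lines of Rust. This is not chaz's actual code; `ModelBackend`, `EchoModel`, and `handle_message` are hypothetical names used only to illustrate the pattern of keeping the chat transport (Matrix or otherwise) decoupled from the model behind it.

```rust
// Hypothetical sketch: route an incoming chat message to a
// pluggable model backend and return its reply. A real bot (like
// chaz) would receive events from a Matrix client library instead.
trait ModelBackend {
    fn complete(&self, prompt: &str) -> String;
}

// Stand-in backend; a real implementation would call an LLM API.
struct EchoModel;

impl ModelBackend for EchoModel {
    fn complete(&self, prompt: &str) -> String {
        format!("model reply to: {prompt}")
    }
}

// Core dispatch: any chat event ultimately reduces to this call.
fn handle_message(backend: &dyn ModelBackend, body: &str) -> String {
    backend.complete(body.trim())
}

fn main() {
    let backend = EchoModel;
    println!("{}", handle_message(&backend, "  hello  "));
}
```

Because the backend is a trait object, "supports any other model" falls out naturally: swapping providers means swapping one `ModelBackend` implementation, with no change to the chat-facing code.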
chatty-llama
Chatty LLama: A fullstack Rust + React chat app using Meta's Llama-2 LLMs https://github.com/Sollimann/chatty-llama
Link to repo: https://github.com/Sollimann/chatty-llama. FYI, I'm using Rust for model hosting and inference, React for the chat app, and Caddy as the web server. Inference currently runs purely on the CPU, with the option of running on a GPU.
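The "CPU by default, GPU optional" setup described above usually comes down to a small device-selection step at startup. A minimal sketch of that decision, with hypothetical names (`Device`, `select_device`) rather than chatty-llama's actual API:

```rust
// Hypothetical device selection mirroring "runs purely on CPU,
// with the option of running on GPU".
#[derive(Debug, PartialEq)]
enum Device {
    Cpu,
    Gpu,
}

// Use the GPU only when it is both requested and actually present;
// otherwise fall back to the CPU default.
fn select_device(prefer_gpu: bool, gpu_available: bool) -> Device {
    if prefer_gpu && gpu_available {
        Device::Gpu
    } else {
        Device::Cpu
    }
}

fn main() {
    // Default path: CPU inference.
    assert_eq!(select_device(false, true), Device::Cpu);
    // Opt-in path: GPU when the hardware is there.
    assert_eq!(select_device(true, true), Device::Gpu);
    println!("device selection ok");
}
```

Falling back to CPU when the GPU is requested but unavailable keeps the app runnable on any machine, which matches how such projects typically ship a CPU-first default.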
What are some alternatives?
floneum - Instant, controllable, local pre-trained AI models in Rust
tenere - 🔥 TUI interface for LLMs written in Rust
llm-chain - `llm-chain` is a powerful Rust crate for building chains of large language models, allowing you to summarise text and complete complex tasks
fullstack-rust - Reference implementation of a full-stack Rust application
smartgpt - A program that provides LLMs with the ability to complete complex tasks using plugins.
fireside-chat - An LLM interface (chat bot) implemented in pure Rust using HuggingFace/Candle over Axum Websockets, an SQLite Database, and a Leptos (Wasm) frontend packaged with Tauri!