| | llm | dioxus |
|---|---|---|
| Mentions | 41 | 155 |
| Stars | 5,954 | 18,613 |
| Growth | 3.1% | 11.9% |
| Activity | 9.4 | 9.9 |
| Latest commit | about 2 months ago | 2 days ago |
| Language | Rust | Rust |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
llm
- Open-sourcing a simple automation/agent workflow builder
We're open-sourcing a project that lets you build simple automations/agent workflows that use LLMs for different tasks. Kinda like Zapier or IFTTT, but focused on using natural language to accomplish your tasks. It's super early, but we'd love to start getting feedback to steer it in the right direction. It currently supports OpenAI and local models through llm.
- Meta's Segment Anything written with C++ / GGML
> Tensorflow is a C++ framework that has Python bindings and a Python library, but when the models are served they are running on C++
Sure, and it's only a simple 20 step process that involves building Tensorflow from source. Yeay!
https://medium.com/@hamedmp/exporting-trained-tensorflow-mod...
Let me see what the process for compiling an LLM written in Rust is....
https://github.com/rustformers/llm
cargo install llm-cli
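For context, this is what running inference with the installed CLI looked like at the time; the command and model file are taken from a comment further down this page, and the exact flags may have changed in newer releases of the crate:

```shell
# CPU inference with the rustformers CLI; the GGML model file is the one
# quoted in a comment on this page, and the flags reflect the CLI as of
# these posts (they may differ in current releases)
llm llama infer -m ggml-gpt4all-j-v1.3-groovy.bin \
  -p "Rust is a cool programming language because"
```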
- Announcing Floneum (an open-source graph editor for local AI workflows, written in Rust)
Floneum is a graph editor for local AI workflows. It uses llm to run large language models locally, egui and dioxus for the frontend, and wasmtime for the plugin system. If you are interested in the project, consider joining the Discord, or building a plugin for Floneum in Rust using WASI.
- Are there any tools or frameworks similar to "langchain" or "llamaindex" but implemented or designed in a language other than Python?
- (1/2) May 2023
Run inference for Large Language Models on CPU, with Rust (https://github.com/rustformers/llm)
- I built a multi-platform desktop app to easily download and run models, open source btw
On the rustformers GitHub page, I see that one of the commands to generate the answer is `llm llama infer -m ggml-gpt4all-j-v1.3-groovy.bin -p "Rust is a cool programming language because"`. My basic idea for now is to change the Tauri app so that it passes the `-p` prompt, which it receives from my code through the link or through a shared variable (if I don't use the link and launch your app separately).
- Weekly Megathread - 14 May 2023
- rustformers/llm: Run inference for Large Language Models on CPU, with Rust 🦀🚀🦙
wonnx has done some fantastic work in this regard, so that's where we plan to start once we get there. In terms of general discussion of alternate backends, see this issue.
- llm: a Rust crate/CLI for CPU inference of LLMs, including LLaMA, GPT-NeoX, GPT-J and more
dioxus
- Dioxus 0.5: Web, Desktop, Mobile Apps in Rust
We have a web components example here: https://github.com/DioxusLabs/dioxus/blob/fd21c971038840130f...
Everything should work as normal, except that attributes are not typed and custom event listeners must be implemented with web-sys.
- Container2wasm: Convert Containers to WASM Blobs
- Why Are Tech Reporters Sleeping on the Biggest App Store Story?
I think something like https://dioxuslabs.com could deliver native, cross platform apps and win back mobile.
- Using Dioxus with Rust to build performant single-page apps
While we took an in-depth look at Dioxus in this tutorial, there is still so much to learn. Luckily, Dioxus provides detailed documentation with references and cookbooks to guide developers. Make sure you check it out, and feel free to comment below with any questions.
- Package All the Things
You can probably imagine the challenges of integrating such a system in a robust way that does a good job and improves on the status quo. We felt like the Tauri implementation worked well for Tauri apps. But there’s a problem: it was so tightly coupled to Tauri that the work couldn’t be enjoyed by other projects (not even the ones like Dioxus who were using Tauri’s underlying technology of Tao + Wry).
- Show HN: Play Euchre with AI Bots
- Dioxus: Fullstack GUI library for desktop, web, mobile, and more
- Projects to contribute to?
- Ask HN: React Native or Flutter for a new app in 2023?
- Announcing Floneum (an open-source graph editor for local AI workflows, written in Rust)
Floneum is a graph editor for local AI workflows. It uses llm to run large language models locally, egui and dioxus for the frontend, and wasmtime for the plugin system. If you are interested in the project, consider joining the Discord, or building a plugin for Floneum in Rust using WASI.
What are some alternatives?
llama.cpp - LLM inference in C/C++
tauri - Build smaller, faster, and more secure desktop applications with a web frontend.
ggml - Tensor library for machine learning
yew - Rust / Wasm framework for creating reliable and efficient web applications
GPTQ-for-LLaMa - 4 bits quantization of LLaMA using GPTQ
leptos - Build fast web applications with Rust.
alpaca-lora - Instruct-tune LLaMA on consumer hardware
sycamore - A library for creating reactive web apps in Rust and WebAssembly
alpaca.cpp - Locally run an Instruction-Tuned Chat-Style LLM
iced - A cross-platform GUI library for Rust, inspired by Elm
SD-CN-Animation - This script allows you to automate video stylization tasks using StableDiffusion and ControlNet.
Flutter - Flutter makes it easy and fast to build beautiful apps for mobile and beyond