| | deep-diamond | llm |
|---|---|---|
| Mentions | 16 | 41 |
| Stars | 412 | 5,911 |
| Growth | -0.5% | 2.4% |
| Activity | 7.6 | 9.4 |
| Last Commit | 2 months ago | about 1 month ago |
| Language | Clojure | Rust |
| License | Eclipse Public License 1.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
deep-diamond
- LLaMA-rs: Run inference of LLaMA on CPU with Rust 🦀🦙
I had some "classical ML" knowledge and knew a bit about the math behind DL and tensors in general thanks to the book Deep Learning for Programmers showcased in this repo: https://github.com/uncomplicate/deep-diamond (it's not in Rust, and I'm not sure what the current state of it is, though!).
- Interactive Programming for Artificial Intelligence Book Series
-
Neanderthal, Deep Diamond, and ClojureCUDA now support the latest CUDA 11.7 GPU computing platform.
Please check out Clojars, or https://neanderthal.uncomplicate.org/ https://clojurecuda.uncomplicate.org/ https://github.com/uncomplicate/deep-diamond
- Uncomplicate releases with better CUDA compatibility (Deep Diamond, Neanderthal, ClojureCUDA)
- Deep Diamond 0.22.0 released
- What are some book suggestions you have for someone that is interested in learning about neural networks?
- Reading the Deep Learning Book by Goodfellow
- Friends, I have a bit of a programming background and want to get into machine learning; can you recommend free resources?
- Some Maths Resources to Help You in Your ML Journey
- I want to quit my data analyst job and learn and become a Clojure developer
Do Clojure as a side gig or in your free time, and let the day job pay the bills. If you can, maybe incorporate Clojure into your day job to solve small problems (https://github.com/clj-python/libpython-clj and https://github.com/scicloj/clojisr provide bridges to/from Python and R). There is a lot of effort going into the data science side as well; the scicloj effort has produced a lot of growth over the last 2 years: tech.ml.dataset, tech.ml (now scicloj.ml). Dragan has a bunch of excellent stuff in Neanderthal and Deep Diamond. There are also bindings to other JVM libraries from multiple languages.
llm
- Open-sourcing a simple automation/agent workflow builder
We're open-sourcing a project that lets you build simple automations/agent workflows that use LLMs for different tasks. Kinda like Zapier or IFTTT, but focused on using natural language to accomplish your tasks. It's super early, but we'd love to start getting feedback to steer it in the right direction. It currently supports OpenAI and local models through llm.
- Meta's Segment Anything written with C++ / GGML
> Tensorflow is a C++ framework that has Python bindings and a Python library, but when the models are served they are running on C++
Sure, and it's only a simple 20-step process that involves building Tensorflow from source. Yay!
https://medium.com/@hamedmp/exporting-trained-tensorflow-mod...
Let me see what the process for compiling an LLM written in Rust is...
https://github.com/rustformers/llm
cargo install llm-cli
- Announcing Floneum (an open-source graph editor for local AI workflows written in Rust)
Floneum is a graph editor for local AI workflows. It uses llm to run large language models locally, egui and dioxus for the frontend, and wasmtime for the plugin system. If you are interested in the project, consider joining the Discord or building a plugin for Floneum in Rust using WASI.
- Are there any tools or frameworks similar to "langchain" or "llamaindex" but implemented or designed in a language other than Python?
- (1/2) May 2023
Run inference for Large Language Models on CPU, with Rust (https://github.com/rustformers/llm)
- I built a multi-platform desktop app to easily download and run models, open source btw
On the rustformers GitHub page I see that one of the commands to generate an answer is llm llama infer -m ggml-gpt4all-j-v1.3-groovy.bin -p "Rust is a cool programming language because". My basic idea for now is to change the Tauri app to let it pass -p prompt, receiving the prompt from my code either through the link or through a shared variable (if I don't use the link and instead launch your app separately).
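The idea above, driving the llm CLI with a dynamic prompt from a Tauri (or any other Rust) app, can be sketched with std::process::Command. This is a minimal sketch, not the desktop app's actual code: build_infer_args is a hypothetical helper, and the subcommand, flags, and model file name simply mirror the command quoted above.

```rust
use std::process::Command;

/// Build the argument list for `llm llama infer` with a caller-supplied
/// prompt (hypothetical helper; the subcommand and flags mirror the
/// command quoted above).
fn build_infer_args(model: &str, prompt: &str) -> Vec<String> {
    vec![
        "llama".to_string(),
        "infer".to_string(),
        "-m".to_string(),
        model.to_string(),
        "-p".to_string(),
        prompt.to_string(),
    ]
}

fn main() {
    let args = build_infer_args(
        "ggml-gpt4all-j-v1.3-groovy.bin",
        "Rust is a cool programming language because",
    );
    // Spawn the CLI; this assumes `llm` is on PATH (e.g. installed
    // via `cargo install llm-cli`) and that the model file exists.
    match Command::new("llm").args(&args).output() {
        Ok(out) => println!("{}", String::from_utf8_lossy(&out.stdout)),
        Err(e) => eprintln!("failed to run llm: {e}"),
    }
}
```

Spawning the CLI as a subprocess keeps the Tauri frontend decoupled from any particular inference backend; swapping in a different model is just a different -m argument.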
- Weekly Megathread - 14 May 2023
- rustformers/llm: Run inference for Large Language Models on CPU, with Rust 🦀🚀🦙
wonnx has done some fantastic work in this regard, so that's where we plan to start once we get there. In terms of general discussion of alternate backends, see this issue.
- llm: a Rust crate/CLI for CPU inference of LLMs, including LLaMA, GPT-NeoX, GPT-J and more
What are some alternatives?
tech.ml.dataset - A Clojure high performance data processing system
llama.cpp - LLM inference in C/C++
neanderthal - Fast Clojure Matrix Library
ggml - Tensor library for machine learning
compare_gan - Compare GAN code.
GPTQ-for-LLaMa - 4 bits quantization of LLaMA using GPTQ
clojisr - Clojure speaks statistics - a bridge between Clojure and R
alpaca-lora - Instruct-tune LLaMA on consumer hardware
mmaction2 - OpenMMLab's Next Generation Video Understanding Toolbox and Benchmark
alpaca.cpp - Locally run an Instruction-Tuned Chat-Style LLM
scicloj.ml - A Clojure machine learning library
SD-CN-Animation - This script automates video stylization tasks using StableDiffusion and ControlNet.