| | JPEGDEC | candle |
|---|---|---|
| Mentions | 5 | 17 |
| Stars | 347 | 13,766 |
| Growth | - | 6.4% |
| Activity | 7.8 | 9.9 |
| Latest Commit | about 2 months ago | 3 days ago |
| Language | C | Rust |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
JPEGDEC
-
Family Photos vs 256 Kb RAM
It takes some time to decode, but I can grab, for example, this 259k JPEG: http://placekitten.com/3840/2160
And display it from the microSD card with this code: https://gist.github.com/Gadgetoid/0b8e352e377135d743338c9483...
As more or less demonstrated by this example: https://github.com/pimoroni/pimoroni-pico/blob/main/micropyt...
Decoding is done via Larry Bank's JPEGDEC: https://github.com/bitbank2/JPEGDEC
It uses roughly 20k of RAM for the buffers needed to decode JPEGs into blocks, which are passed to a drawing routine that copies the data into the display's RAM buffer (on our larger Inky display this is backed by PSRAM). I don't believe there's a hard limit on the size of JPEG you can display. The main bottlenecks are decoding time and the relatively limited scaling options: FULL, HALF, QUARTER and EIGHTH.
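The block-based pattern described above can be sketched without the library itself. Below is a minimal, hypothetical C sketch (the struct and names are mine, not JPEGDEC's actual API): the decoder hands a callback one small decoded block at a time, and the callback copies it into the display framebuffer with edge clipping, so RAM usage stays constant regardless of the source image size.

```c
#include <stdint.h>

/* Hypothetical stand-in for the per-block draw-callback argument:
 * the decoder delivers one small decoded block at a time
 * (position, size, and a small pixel buffer), so only a fixed
 * amount of RAM is needed regardless of the full image size. */
typedef struct {
    int x, y;               /* block position in image coordinates */
    int width, height;      /* block dimensions (e.g. 8x8 or 16x16) */
    const uint16_t *pixels; /* decoded RGB565 pixels, row-major */
} DrawBlock;

#define FB_W 320
#define FB_H 240
static uint16_t framebuffer[FB_W * FB_H];

/* Copy one decoded block into the framebuffer, clipping at the edges.
 * Returns nonzero to tell the decoder to keep going. */
int draw_block(const DrawBlock *b) {
    for (int row = 0; row < b->height; row++) {
        int fy = b->y + row;
        if (fy < 0 || fy >= FB_H) continue;
        for (int col = 0; col < b->width; col++) {
            int fx = b->x + col;
            if (fx < 0 || fx >= FB_W) continue;
            framebuffer[fy * FB_W + fx] = b->pixels[row * b->width + col];
        }
    }
    return 1;
}
```

The real library's callback carries more fields (bit depth, user pointer, etc.), but the shape of the loop — clip, then copy row by row — is the essence of why large images fit in small RAM.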
This aside, the author's solution is actually quite elegant. As long as you have control over the image pipeline there's no real reason to encumber the device with handling large (both in bytes and pixel dimensions) files. You'll also get much better dithering results writing your own routine to convert files from JPG to the raw 4-bit-per-pixel format for the display. Our built-in dithering is just a plain ordered dither matrix and, while quaint and retro, it leaves much to be desired visually.
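A plain ordered dither of the kind mentioned is small enough to show in full. This is a generic C sketch using the standard 4x4 Bayer matrix — not Pimoroni's actual routine — that quantizes an 8-bit grey value down to a 4-bit display level, using the pixel's position to decide when to round up.

```c
#include <stdint.h>

/* Standard 4x4 Bayer threshold matrix (values 0..15). */
static const uint8_t bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

/* Ordered-dither an 8-bit grayscale value down to 4 bits (0..15).
 * The (x, y) position selects a threshold cell, so the rounding
 * error is spread spatially instead of banding. */
uint8_t dither4(uint8_t v, int x, int y) {
    int scaled = v * 15;      /* spread 0..255 over 16 output levels */
    int base = scaled / 255;  /* nearest-below 4-bit level */
    int rem = scaled % 255;   /* leftover error for this pixel */
    if ((rem * 16) / 255 > bayer4[y & 3][x & 3])
        base++;               /* round up in a fraction of cells */
    return (uint8_t)base;
}
```

Averaged over a 4x4 tile, the output levels approximate the input intensity; a mid-grey of 127 comes out as a mix of 7s and 8s.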
If you're trying to use a public API you can also have GitHub Actions automate the whole image conversion process and publish the results to GitHub Pages. This works great for the daily XKCD, for example, serving both to reformat the strip for the display, credit the author, extract the "alt" text, and avoid excess requests to the origin. E.g.: https://pimoroni.github.io/feed2image/xkcd-800x480-daily.jpg. Though the astute will notice I still opted for JPEG in this case.
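A workflow along those lines might look like the following sketch. The source URL, output name, and the ImageMagick resize step are illustrative assumptions, not the actual feed2image pipeline:

```yaml
name: daily-image
on:
  schedule:
    - cron: "0 6 * * *"   # fetch and convert once a day
jobs:
  convert:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Fetch and reformat the strip
        run: |
          mkdir -p out
          # hypothetical origin URL
          curl -sL https://example.com/source.jpg -o raw.jpg
          convert raw.jpg -resize 800x480 -gravity center \
                  -background white -extent 800x480 \
                  out/daily-800x480.jpg
      - name: Publish to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./out
```

The device then polls the stable Pages URL, and the origin site only ever sees one request a day from the workflow.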
-
Making a tiny video player using an ESP32 programmed in Arduino. Managed to fit all of the components into a tiny space and am about to order my PCBs. Wish me luck!
I'm doing the same thing with MJPEG and the JPEGDEC library https://github.com/bitbank2/JPEGDEC. I preprocess the video with ffmpeg to scale it down to 240x240 and strip the inter-frame compression.
candle
-
karpathy/llm.c
Candle already exists[1], and it runs pretty well. It can use both CUDA and Metal backends (or just plain old CPU).
[1] https://github.com/huggingface/candle
- Best alternative for Python
-
Is there any LLM that can be installed without Python?
Check out Candle! It's a deep learning framework for Rust that lets you run LLMs from standalone binaries.
-
Announcing Kalosm - a local-first AI meta-framework for Rust
Kalosm is a meta-framework for AI written in Rust using candle. Kalosm supports local quantized large language models like Llama, Mistral, Phi-1.5, and Zephyr. It also supports other quantized models like Wuerstchen, Segment Anything, and Whisper. In addition to local models, Kalosm supports remote models like GPT-4 and ada embeddings.
-
RFC: candle-lora
I have been working on a machine learning library called candle-lora for Candle. It implements a technique called LoRA (low-rank adaptation), which lets you reduce a model's trainable parameter count by wrapping and freezing the original layers.
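The parameter saving is easy to quantify. With a frozen weight matrix $W \in \mathbb{R}^{d \times k}$, LoRA trains only a rank-$r$ update:

```latex
W' = W + BA, \qquad
B \in \mathbb{R}^{d \times r},\quad
A \in \mathbb{R}^{r \times k},\quad
r \ll \min(d, k)
```

so the trainable count falls from $dk$ to $r(d+k)$. For example, with $d = k = 4096$ and $r = 8$, that is $8 \cdot 8192 = 65{,}536$ trainable parameters instead of $16{,}777{,}216$ — roughly $0.4\%$ of the original layer.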
-
ExecuTorch: Enabling On-Device Inference for embedded devices
[2] https://github.com/huggingface/candle/issues/313
-
[P] Open-source project to run LLMs locally in the browser, such as Phi-1.5 for fully private inference
We provide full local inference in browser, by using libraries from Hugging Face like transformers.js or candle for WASM inference.
-
Update on the Candle ML framework.
We first announced Candle, a minimalist ML framework in Rust, 6 weeks ago. Since then we've focused on adding various recent models and improving the framework to support the necessary features efficiently. You can check out a gallery of the examples; supported models include:
-
Should I Haskell or OCaml?
How did you select those two as your options?
I'm just a hobbyist who enjoys programming, and I eventually wanted to expand beyond Python. I looked at Haskell, read Learn You a Haskell, and did some Exercism exercises, but never got anywhere close to being able to use it for real projects. I've been trying to learn about Lisp lately and feel like I've come to a similar dead end.
On the other hand, both Go and Rust have felt fulfilling and practical, with static typing and solid tooling, cross-compilation, static binaries, and dependency management that is a huge breath of fresh air coming from Python.
The ML / data science scene is nowhere near as developed as in Python, and I still lean on jupyter/polars/PyTorch here, but I think the candle project[0] seems very interesting. Compiling whisper down to a single CUDA-leveraging binary for fast local transcription is pretty cool!
[0]: https://github.com/huggingface/candle
- Minimalist ML framework for Rust
What are some alternatives?
ReflectionsOS - Reflections is a hardware and software platform for building entertaining mobile experiences.
Universal-G-Code-Sender - A cross-platform G-Code sender for GRBL, Smoothieware, TinyG and G2core.
pimoroni-pico - Libraries and examples to support Pimoroni Pico add-ons in C++ and MicroPython.
burn - Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals. [Moved to: https://github.com/Tracel-AI/burn]
OneBitDisplay - A full featured Arduino display library for 1-bit per pixel OLED, LCD and e-paper displays
tch-rs - Rust bindings for the C++ api of PyTorch.
picojpeg - Tiny JPEG decoder for 8/16-bit microcontrollers
bCNC - GRBL CNC command sender, autoleveler and g-code editor
cncjs - A web-based interface for CNC milling controller running Grbl, Marlin, Smoothieware, or TinyG.
gsender - Connect to and control Grbl-based CNCs with ease
cncjs-kt-ext - Auto-leveling extension for CNCjs
FluidNC - The next generation of motion control firmware