tinygrad
docs
| | tinygrad | docs |
|---|---|---|
| Mentions | 17 | 235 |
| Stars | 24,138 | 1,714 |
| Growth | 3.8% | 0.0% |
| Activity | 10.0 | 0.0 |
| Last commit | 3 days ago | about 2 years ago |
| Language | Python | |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
tinygrad
-
AMD Unveils Ryzen 8000G Series Processors: Zen 4 APUs for Desktop with Ryzen AI
Not sure if I completely understand what "Ryzen AI" does, but Tinygrad for example has some limited support for RDNA3[0]. It isn't quite there yet in terms of performance though, as you can read in the comments of that file.
There's also a small tutorial by AMD on how to use the WMMA intrinsic[1] with AMD's hipcc[2] compiler. Documentation is kinda sparse, but the instruction set is not huge. The RDNA3 ISA guide[3] might also be helpful (and only a fraction of the pages are relevant).
0. https://github.com/tinygrad/tinygrad/blob/master/extra/gemm/...
1. https://gpuopen.com/learn/wmma_on_rdna3/
2. https://github.com/ROCm/HIPCC
3. https://www.amd.com/content/dam/amd/en/documents/radeon-tech...
- Tinygrad 0.8.0 Release
-
Beyond Backpropagation - Higher Order, Forward and Reverse-mode Automatic Differentiation for Tensorken
This post describes how I added automatic differentiation to Tensorken. Tensorken is my attempt to build a fully featured yet easy-to-understand and hackable implementation of a deep learning library in Rust. It takes inspiration from the likes of PyTorch, Tinygrad, and JAX.
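The post covers forward- and reverse-mode AD in Tensorken (which is Rust); the core trick behind forward mode is small enough to sketch in Python with dual numbers. This is an illustrative sketch of the idea, not Tensorken's actual implementation:

```python
# Forward-mode automatic differentiation via dual numbers.
# A Dual carries a value and its derivative; overloaded operators
# propagate both through any computation in a single forward pass.
# (Illustrative sketch only, not Tensorken's Rust code.)
import math

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def derivative(f, x):
    """Evaluate f and df/dx at x in one forward pass."""
    out = f(Dual(x, 1.0))  # seed dx/dx = 1
    return out.val, out.der

# d/dx of x*sin(x) + x is sin(x) + x*cos(x) + 1
val, der = derivative(lambda x: x * x.sin() + x, 2.0)
```

Reverse mode (backprop) runs the chain rule the other way, which is cheaper when one output depends on many inputs; forward mode is the one you can fit in a comment.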
-
[D] What is a good way to maintain code readability and code quality while scaling up complexity in libraries like Hugging Face?
What do you think about tinygrad? I think it's a good example of a growing, well-written, (partially) well-documented library with many close-to-reference implementations.
-
AMD MI300 Performance – Faster Than H100, but How Much?
The idea of model architecture making fast hardware design easier is what makes https://github.com/tinygrad/tinygrad so interesting.
-
💻 7 Open-Source DevTools That Save Time You Didn't Know to Exist ⌛🚀
Website: https://tinygrad.org/
- Tinygrad
-
How to train an Iris dataset classifier with Tinygrad
Before we begin, make sure you have TinyGrad and the required dependencies installed. You can find the installation instructions here.
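The tutorial's code itself isn't reproduced here, but the training loop it walks through (forward pass, loss, gradients, update) has the same shape in any framework. Here is a framework-free sketch of that loop — softmax regression by gradient descent on a made-up three-class, two-feature dataset standing in for Iris; none of this is tinygrad's API:

```python
# Framework-free sketch of the training loop such a tutorial builds:
# softmax regression trained by full-batch gradient descent on a
# tiny synthetic 3-class dataset standing in for Iris.
import math, random

random.seed(0)

# synthetic "iris-like" data: 2 features, 3 well-separated clusters
centers = [(0.0, 0.0), (3.0, 3.0), (0.0, 4.0)]
X = [[c[0] + random.gauss(0, 0.5), c[1] + random.gauss(0, 0.5)]
     for c in centers for _ in range(30)]
y = [k for k in range(3) for _ in range(30)]

W = [[0.0, 0.0] for _ in range(3)]  # one weight row per class
b = [0.0, 0.0, 0.0]
lr = 0.1

def softmax(z):
    m = max(z)                      # subtract max for stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

for epoch in range(200):
    gW = [[0.0, 0.0] for _ in range(3)]
    gb = [0.0, 0.0, 0.0]
    for xi, yi in zip(X, y):
        logits = [W[k][0]*xi[0] + W[k][1]*xi[1] + b[k] for k in range(3)]
        p = softmax(logits)
        for k in range(3):
            err = p[k] - (1.0 if k == yi else 0.0)  # dLoss/dlogit_k
            gW[k][0] += err * xi[0]
            gW[k][1] += err * xi[1]
            gb[k] += err
    n = len(X)
    for k in range(3):              # gradient step on averaged grads
        W[k][0] -= lr * gW[k][0] / n
        W[k][1] -= lr * gW[k][1] / n
        b[k] -= lr * gb[k] / n

def predict(xi):
    logits = [W[k][0]*xi[0] + W[k][1]*xi[1] + b[k] for k in range(3)]
    return max(range(3), key=lambda k: logits[k])

acc = sum(predict(xi) == yi for xi, yi in zip(X, y)) / len(X)
```

In tinygrad the same loop shrinks to a few Tensor operations plus an optimizer step, with the gradient computation handled by autograd instead of being written out by hand.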
-
Decomposing Language Models into Understandable Components
Try to get something like tinygrad[1] running locally; that way you can tweak things a bit, run it again, and see how it performs. While doing this you'll pick up most of the concepts and get a feeling for how things work. Also, take a look at projects like llama.cpp[2]; you don't have to fully understand what's going on there, though.
You may need some intermediate knowledge of linear algebra and this thing called "data science" nowadays, which is pretty much knowing how to mangle data and visualize it.
Try creating a small model on your own; it doesn't have to be super fancy, just make sure it does something you want it to do. And then ... you'll probably be able to go on your own from there.
1: https://github.com/tinygrad/tinygrad
2: https://github.com/ggerganov/llama.cpp
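Most of the concepts the comment says you'll pick up boil down to one mechanism: a graph of operations that can replay itself backwards. A micrograd-style scalar autograd fits in a page — this is a toy illustration of the idea, not tinygrad's actual code:

```python
# Micrograd-style reverse-mode autodiff on scalars: each Value
# remembers how it was produced, and backward() walks the graph
# in reverse topological order applying the chain rule.
# A toy illustration of the idea behind tinygrad, not its code.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():           # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():           # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # build topological order, then apply chain rule in reverse
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x          # y = x^2 + x
y.backward()           # dy/dx = 2x + 1 = 7 at x = 3
```

Once this clicks, a "small model" is just a handful of such values (weights) nudged against their gradients in a loop.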
- Tinygrad 0.7.0
docs
-
A Brief History of the U.S. Trying to Add Backdoors into Encrypted Data
marcan of the Asahi Linux project got into a discussion on reddit about this, and says that when it comes to hardware, you just can’t know.
> I can't prove the absence of a silicon backdoor on any machine, but I can say that given everything we know about AS systems (and we know quite a bit), there is no known place a significant backdoor could hide that could completely compromise my system. And there are several such places on pretty much every x86 system
(Long) thread starts here; show hidden comments for the full discussion: https://old.reddit.com/r/AsahiLinux/comments/13voeey/what_is...
I highly recommend reading this if you’re interested https://github.com/AsahiLinux/docs/wiki/Introduction-to-Appl...
-
The Register looks at the first release of Fedora Asahi Remix
Depends on the box. In general, if there is a hardwired HDMI port it works; if it's an alt mode, it doesn't yet. The feature pages give detail by hardware; here's a direct link to the M2 page: https://github.com/AsahiLinux/docs/wiki/M2-Series-Feature-Su...
-
Fedora Asahi Remix
https://github.com/AsahiLinux/docs/wiki/M1-Series-Feature-Su...
According to this page it should work on an M1 MBP, but there is also a note about a specific patch being released next week.
-
Sonoma updates bricking MBPs
I'm just refuting that OP's dot update problem on Sonoma was caused by the refresh rate bug. In all likelihood OP doesn't have a weird Sonoma/Ventura dual boot situation going on (or Asahi Linux for that matter, whose team wrote a great article about this). In all my testing (and with a large enterprise sample size) we had zero reports of the refresh bug impacting an Apple Silicon Mac running just Sonoma itself.
- Speaker Support in Asahi Linux
-
Tuxedo Pulse Gen 3
> They don't support variations of software at all. They support the hardware. [...] Asahi does not need to support applications at all.
From their FAQ page[1]:
> We will eventually release a remix of Arch Linux ARM, packaged for installation by end-users, as a distribution of the same name. The majority of the work resides in hardware support, drivers, and tools, and it will be upstreamed to the relevant projects. The distribution will be a convenient package for easy installation by end-users and give them access to bleeding-edge versions of the software we develop.
As distro maintainers, it is their job to make sure the applications they package work on the hardware they support. This includes submitting patches upstream when that is not the case, as application maintainers likely wouldn't want to support such a niche environment directly. So, yes, they rely on volunteers to fix issues, but they will likely have to support many applications themselves.
There is still a lot of broken software, as this list[2] is surely not exhaustive.
> Same deal for any other hardware manufacturer. [...] Really not much different to other hardware manufacturers since Linux started.
No, it's very different. First of all, the number of Linux hackers who volunteered to reverse engineer the wide variety of hardware was orders of magnitude larger than the Asahi team. Even if they limit the number of devices they support, modern computers are far more complex than in the early days of Linux. Regardless of how talented the Asahi team is, maintaining all the hardware of a modern computer is a Sisyphean task for a project run by volunteers.
Secondly, hardware manufacturers could see the benefit of getting their hardware to run in Linux, and many eventually took over support from volunteers. Apple has shown no interest in doing so, and has historically been hostile to open source.
> Asahi devs have made it clear that Apple has chosen to avoid blocking installation of other operating systems.
The fact that they allow installation of other operating systems today doesn't mean this decision couldn't change in the future. Services are a large part of their business, and allowing a group of hackers to use their hardware without being part of their software ecosystem may seem like a non-issue today; but if this group grows larger, assuming projects like Asahi are successful, it might become a considerable loss of income, which wouldn't be in their best interest.
> Apple has no issue with it.
Can you point me to an official acknowledgment of Asahi Linux by Apple? Or any indication that leaving this door open was a sign of goodwill, instead of a lack of interest in closing it? What makes you think they wouldn't eventually lock down MacBooks in the same way they do iPhones and iPads?
> ARM is a stable well supported platform for Linux
It's really not. A lot of software works, but when it doesn't, the user is SOL. As you can see on their Broken Software page[2], the major issue is precisely with AArch64 support. This should improve eventually, and Asahi is certainly a torchbearer in this scenario, but today it's yet another hurdle to using Apple hardware.
[1]: https://asahilinux.org/about/#is-this-a-linux-distribution
[2]: https://github.com/AsahiLinux/docs/wiki/Broken-Software
- Asahi Linux Team Uncovers macOS Refresh Rate Bugs: Sonoma Boot Failures
-
Update on the Sonoma bug situation
More information about the macOS Sonoma ProMotion bug here.
-
PSA: Don't upgrade to Ventura 13.6+ or Sonoma 14.0+ on Apple Silicon with custom display settings
Here’s the actual issue for anyone that cares, fully documented : https://github.com/AsahiLinux/docs/wiki/macOS-Sonoma-Boot-Failures
What are some alternatives?
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
idevicerestore - Restore/upgrade firmware of iOS devices
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️ [Moved to: https://github.com/tinygrad/tinygrad]
llama.cpp - LLM inference in C/C++
FEX - A fast usermode x86 and x86-64 emulator for Arm64 Linux
llama - Inference code for Llama models
asahi-installer - Asahi Linux installer
openpilot - openpilot is an open source driver assistance system. openpilot performs the functions of Automated Lane Centering and Adaptive Cruise Control for 250+ supported car makes and models.
AsahiLinux
tensorflow_macos - TensorFlow for macOS 11.0+ accelerated using Apple's ML Compute framework.
nixos-apple-silicon - Resources to install NixOS bare metal on Apple Silicon Macs