gccrs
axolotl
| | gccrs | axolotl |
| --- | --- | --- |
| Mentions | 102 | 29 |
| Stars | 2,255 | 5,641 |
| Growth | 1.6% | 23.6% |
| Activity | 10.0 | 9.8 |
| Latest commit | 5 days ago | 5 days ago |
| Language | | Python |
| License | GNU General Public License v3.0 only | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gccrs
- FreeBSD evaluating Rust's adoption into base system
There is a Rust front-end for GCC that is under active development [1]. If the chip vendors are not willing to develop and upstream an LLVM back-end, then they can feel free to start contributing to it.
[1] https://rust-gcc.github.io/
- Why do lifetimes need to be leaky?
That's why gccrs doesn't even consider lifetime checking a part of the language (they plan to use Polonius, too).
- Rust-GCC: GCC Front-End for Rust
- How hard would it be to port the Rust toolchain to a new non-POSIX OS written in Rust and get it to host its own development? What would that process entail?
There's ongoing work on a Rust front-end for GCC (https://github.com/Rust-GCC/gccrs). It's a bit barebones right now (even core doesn't compile yet), but there's funding, demand, and regular progress, so it'll only get better from here. Once gccrs can compile core, it should be ready to compile most Rust code, and so, once you've taught GCC the C calling conventions for your OS, you're golden.
- How hard is it to write a front end for a more complex language like Rust or Kotlin?
I recommend checking out the GCC Rust frontend project.
- Rust contributions for Linux 6.4 are finally merged upstream!
That is what they're referring to, yes. The GitHub repository is https://github.com/Rust-GCC/gccrs
- GCC 13 and the State of Gccrs
- But this misses so much extra context information
3. Macro invocations: there are really subtle rules on how to treat macro invocations such as this, which are not documented at all: https://github.com/Rust-GCC/gccrs/blob/master/gcc/rust/expan...
Some day I personally want to write a blog post about how complicated and under-specified Rust is, and then another about the things I do like, such as iterators being part of libcore so I don't need reactive extensions.
- Break rust Easter Egg Merged Into gccrs
- Any alternate Rust compilers?
(Speaking of which, Rust-GCC (or gcc-rs, or gccrs, or whichever of their other names they decide is the primary one) isn't even going to be implemented entirely in C++. Their plan is to implement just enough to compile Polonius (the NLL 2.0 borrow checker being developed in Rust for rustc) and then share it, since borrow checking isn't necessary for codegen; it only identifies and rejects invalid programs. That makes the C++ portion of it not that different in scope from mrustc.)
- Which programming languages, if all legacy code written in them was ported to a more modern language, would become extinct?
That bridge will be crossed with gccrs (compiling Rust with GCC directly, coming next month with GCC 13) and rustc_codegen_gcc (rustc frontend, GCC backend; works now but doesn't yet have an "easy" setup).
axolotl
- Ask HN: Most efficient way to fine-tune an LLM in 2024?
The approach I see used is axolotl with QLoRA on cloud GPUs, which can be quite cheap.
https://github.com/OpenAccess-AI-Collective/axolotl
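As a rough illustration of that setup, here is a minimal QLoRA config sketch in axolotl's YAML format. The base model, dataset path, and every hyperparameter value below are illustrative assumptions, not recommendations; check the project's example configs for the fields your version actually supports.

```yaml
# Sketch of an axolotl QLoRA fine-tune config (illustrative values only).
base_model: NousResearch/Llama-2-7b-hf  # assumed example model
load_in_4bit: true                      # QLoRA: 4-bit quantized base weights
adapter: qlora

datasets:
  - path: ./data/train.jsonl            # hypothetical local dataset
    type: alpaca                        # instruction/input/output fields

sequence_len: 2048
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true               # attach adapters to all linear layers

micro_batch_size: 2
gradient_accumulation_steps: 4
num_epochs: 3
learning_rate: 0.0002
optimizer: paged_adamw_8bit
lr_scheduler: cosine

output_dir: ./qlora-out
```

A config like this is typically launched with something along the lines of `accelerate launch -m axolotl.cli.train config.yml`; the exact invocation depends on the axolotl version, so consult its README.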
- FLaNK AI - 01 April 2024
- LoRA from Scratch implementation for LLM finetuning
https://github.com/OpenAccess-AI-Collective/axolotl
- Optimized Triton Kernels for full fine tunes
- Axolotl
- Let’s Collaborate to Build a High-Quality, Open-Source Dataset for LLMs!
One option is to look at what Axolotl uses. They have a list of different dataset formats that they support. They're mostly in JSON with specific field names, so you could start putting a dataset together with a text editor or a JSON editor.
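For instance, Axolotl's alpaca-style format is plain JSON records with `instruction`, `input`, and `output` fields. A minimal sketch of putting such a file together in Python (the filename and the example records here are made up):

```python
import json

# A few hand-written records in alpaca-style format: each record has
# "instruction", "input" (may be empty), and "output".
records = [
    {
        "instruction": "Summarize the text in one sentence.",
        "input": "The axolotl is a neotenic salamander native to Mexico.",
        "output": "The axolotl is a Mexican salamander that keeps its larval traits.",
    },
    {
        "instruction": "What is 2 + 2?",
        "input": "",
        "output": "4",
    },
]

# Write the dataset as a single JSON array; a text editor works just as well
# for small sets, this just keeps the field names consistent.
with open("dataset.json", "w") as f:
    json.dump(records, f, indent=2)
```

The same structure scales from a hand-edited file to a generated one, since only the three field names need to stay consistent.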
- Axolotl: Streamline fine-tuning of AI models
- Dataset Creation Tools?
You can save that overall set into a JSON file and load it up as training data in whatever you're using. I'm using axolotl for it at the moment, though a GUI-based option is probably best for the first couple of tries, until you get a feel for the options.
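A sketch of that save-and-reload workflow, using JSON Lines (one JSON object per line, a common on-disk shape for training data). The field names here assume a simple instruction-style format and are hypothetical; match them to whatever your trainer expects:

```python
import json

# Hypothetical "overall set" of question/answer pairs collected elsewhere.
pairs = [
    ("What does gccrs do?", "It is a Rust front-end for GCC."),
    ("What is axolotl used for?", "Fine-tuning large language models."),
]

# Save one JSON object per line (JSONL), which most trainers can stream.
with open("train.jsonl", "w") as f:
    for question, answer in pairs:
        f.write(json.dumps({"instruction": question, "output": answer}) + "\n")

# Load it back the same way to verify the round trip.
with open("train.jsonl") as f:
    loaded = [json.loads(line) for line in f]
print(len(loaded))
```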
- Progress on Reproducing Phi-1/1.5
Looking forward to the results! If it turns out the dataset is reproducible, then it might be a good candidate for ReLoRA training on axolotl!
What are some alternatives?
gcc-rust - a (WIP) Rust frontend for gcc / a gcc backend for rustc
gpt-llm-trainer
rustc_codegen_gcc - libgccjit AOT codegen for rustc
signal-cli - signal-cli provides an unofficial commandline, JSON-RPC and dbus interface for the Signal messenger.
LoRA - Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
mold - Mold: A Modern Linker 🦠
mlc-llm - Enable everyone to develop, optimize and deploy AI models natively on everyone's devices.
rust - Empowering everyone to build reliable and efficient software.
LMFlow - An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
Rust-for-Linux - Adding support for the Rust language to the Linux kernel.
koboldcpp - A simple one-file way to run various GGML and GGUF models with KoboldAI's UI