Top 23 Python Lora Projects
-
Implementation: ORPO has been integrated into popular fine-tuning libraries like TRL, Axolotl, and LLaMA-Factory.
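As a rough sketch of what the TRL side of that integration looks like (base model, dataset, and hyperparameters below are illustrative placeholders, not values from any specific project):

```python
# Sketch: ORPO fine-tuning with a LoRA adapter via TRL.
# Base model, dataset, and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import ORPOConfig, ORPOTrainer

model_name = "meta-llama/Llama-2-7b-hf"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# ORPO trains on preference data: "prompt", "chosen", and "rejected" columns.
dataset = load_dataset("trl-lib/ultrafeedback_binarized", split="train")

args = ORPOConfig(
    output_dir="orpo-lora-out",
    beta=0.1,                         # weight of the odds-ratio preference term
    per_device_train_batch_size=2,
    learning_rate=8e-6,
)

trainer = ORPOTrainer(
    model=model,
    args=args,
    train_dataset=dataset,
    tokenizer=tokenizer,
    peft_config=LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"),
)
trainer.train()
```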
-
Hi,
Yes, you can. The community creates quantized variants of these that can run on consumer hardware. A 4-bit quantization of LLaMA 70B works pretty well on MacBook Pros; the Neural Engine with unified CPU memory is quite solid for these. GPUs are a bit tougher because consumer GPU RAM is still kinda small.
You can also fine-tune them. There are a lot of frameworks, like unsloth, that make this easier: https://github.com/unslothai/unsloth. Fine-tuning can be pretty tricky to get right; you need to be aware of things like learning rates, but there are good resources on the internet where a lot of hobbyists have gotten things working. You do not need a PhD in ML to accomplish this. You will, however, need data that you can represent textually.
Source: Director of Engineering for model serving at Databricks.
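For reference, the basic unsloth flow looks roughly like this (model name and LoRA settings are illustrative; training afterwards uses a standard Hugging Face/TRL trainer):

```python
# Rough sketch of LoRA fine-tuning with unsloth; names and values are illustrative.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # 4-bit quantized base, fits consumer GPUs
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of weights are trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# From here, train with e.g. TRL's SFTTrainer on your own text dataset,
# paying attention to learning rate and epochs as the comment above notes.
```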
-
-
-
mcdse-2b is trained from MrLight/dse-qwen2-2b-mrl-v1 using low-rank adapters (LoRA) on a multilingual corpus of documents. I have trained it on 8xRTX3090 using the DSE approach with the following parameters:
-
Project mention: Ask HN: AI/ML papers to catch up with current state of AI? | news.ycombinator.com | 2023-12-15
LongAlpaca / One of many ways to extend context, and a useful dataset / https://arxiv.org/abs/2309.12307
-
xTuring
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6
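A rough sketch of xTuring's LoRA fine-tuning flow, based on its documented API (the dataset path is a placeholder; check the xTuring docs for the exact model keys):

```python
# Sketch of LoRA fine-tuning with xTuring; dataset path is a placeholder.
from xturing.datasets.instruction_dataset import InstructionDataset
from xturing.models import BaseModel

dataset = InstructionDataset("./alpaca_data")       # instruction/response pairs
model = BaseModel.create("llama_lora")              # LLaMA base with a LoRA adapter
model.finetune(dataset=dataset)

output = model.generate(texts=["Why are open-source LLMs becoming so important?"])
print(output)
```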
-
Project mention: LoRAX: Hot swap LoRA adapters to serve many finetuned models concurrently | news.ycombinator.com | 2024-02-01
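The idea is that one deployment serves a single base model while a per-request adapter ID selects the fine-tune; with the lorax-client package that looks roughly like this (server address and adapter ID are placeholders):

```python
# Querying a running LoRAX server; the adapter ID below is a placeholder.
from lorax import Client

client = Client("http://127.0.0.1:8080")

# Served by the base model.
print(client.generate("Why is the sky blue?").generated_text)

# Same server, but routed through a specific LoRA fine-tune loaded on demand.
print(client.generate(
    "Why is the sky blue?",
    adapter_id="some-org/some-lora-adapter",
).generated_text)
```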
-
Reticulum
The cryptography-based networking stack for building unstoppable networks with LoRa, Packet Radio, WiFi and everything in between.
Project mention: A Simple open-source Phone programmable with Arduino | news.ycombinator.com | 2024-10-19
And maybe an integrated sound cable for Baofeng/Quansheng (K1<->USBA), USB-A as a power bank, and packet radio or http://reticulum.network
-
Project mention: Now You Can Full Fine Tune / DreamBooth Stable Diffusion XL (SDXL) with only 10.3 GB VRAM via OneTrainer | dev.to | 2024-03-25
Used SG161222/RealVisXL_V4.0 as a base model and OneTrainer to train on Windows 10 : https://github.com/Nerogar/OneTrainer
-
Project mention: Nomad, communicate off-grid mesh, forward secrecy and extreme privacy | news.ycombinator.com | 2024-08-15
-
-
-
Project mention: Show HN: Toolkit for LLM Fine-Tuning, Ablating and Testing | news.ycombinator.com | 2024-04-07
-
Lora-for-Diffusers
The most easy-to-understand tutorial for using LoRA (Low-Rank Adaptation) within diffusers framework for AI Generation Researchers🔥
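The core of what the tutorial covers is compact: attaching a LoRA to a diffusers pipeline looks roughly like this (the LoRA path and file name are placeholders):

```python
# Minimal sketch: applying a LoRA to a diffusers pipeline. LoRA path is a placeholder.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Merge-free: the adapter weights are applied on top of the base UNet/text encoder.
pipe.load_lora_weights("path/to/lora", weight_name="pytorch_lora_weights.safetensors")

image = pipe("a watercolor painting of a lighthouse", num_inference_steps=30).images[0]
image.save("lighthouse.png")
```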
-
DoRA
[ICML2024 (Oral)] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation (by NVlabs)
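Besides the official NVlabs code, DoRA is also exposed in Hugging Face PEFT as a flag on the LoRA config; a minimal sketch, assuming a recent PEFT release (base model and hyperparameters are placeholders):

```python
# DoRA via PEFT: decomposes each weight update into magnitude and direction parts.
# Base model and hyperparameters are placeholders; requires a recent peft version.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")
config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    use_dora=True,        # switch from plain LoRA to DoRA
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()
```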
-
LLaMA-LoRA-Tuner
UI tool for fine-tuning and testing your own LoRA models based on LLaMA, GPT-J, and more. One-click run on Google Colab. + A Gradio ChatGPT-like chat UI to demonstrate your language models.
-
Sideband
LXMF client for Android, Linux and macOS allowing you to communicate with people or LXMF-compatible systems over Reticulum networks using LoRa, Packet Radio, WiFi, I2P, or anything else Reticulum supports.
Project mention: Nomad, communicate off-grid mesh, forward secrecy and extreme privacy | news.ycombinator.com | 2024-08-15
Reticulum is incredibly versatile and has an entire ecosystem of tools under development. NomadNet is just one of the messengers. There is Sideband, a mobile app client (https://github.com/markqvist/Sideband), and Reticulum MeshChat, developed by Liam Cottle, which is a browser-based client: https://github.com/liamcottle/reticulum-meshchat.
Reticulum can work over anything that has a throughput greater than 5 bits per second (yes, bits) and an MDU of 500 bytes. Not only can it work over hundreds of different carriers, but each of these carriers can be a part of the same network.
I threw together a quick proof of concept of it working over HF radio. I set up two nodes about 144 km (90 miles) apart. Both were ICOM IC-7300s, with a Raspberry Pi 5 driving the software modem that took packets from Reticulum and sent them over the air. https://www.youtube.com/watch?v=blwNVumLujc
Node 1 was out in the field while Node 2 was back at my house. Node 2 had two interfaces set up: one for the HF modem and another connected to the TCP testnet. This meant that Node 1 could reach any peer over on the TCP testnet.
Here is a quick primer on Reticulum that explains some of the basic concepts: https://www.youtube.com/watch?v=q8ltLt5SK6A
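To give a feel for the Python side, Reticulum's minimal example boils down to creating an identity, registering a destination, and announcing it; which physical carriers it runs over (LoRa, TCP, an HF modem, ...) comes from the Reticulum config rather than the code. Roughly:

```python
# Rough shape of Reticulum's minimal example; interfaces (LoRa, TCP, HF modem, ...)
# are configured in ~/.reticulum/config, not in code.
import RNS

APP_NAME = "example_utilities"

reticulum = RNS.Reticulum()     # start the stack with the configured interfaces
identity = RNS.Identity()       # a new cryptographic identity for this endpoint

destination = RNS.Destination(
    identity,
    RNS.Destination.IN,         # inbound destination
    RNS.Destination.SINGLE,     # addressed to a single identity
    APP_NAME,
    "minimalsample",
)

destination.announce()          # let the network learn a path to this destination
```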
-
-
-
-
-
LLaMA-8bit-LoRA
Repository for Chat LLaMA - training a LoRA for the LLaMA (1 or 2) models on HuggingFace with 8-bit or 4-bit quantization. Research only.
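In current Hugging Face terms, that combination (quantized base weights plus a trainable LoRA) looks roughly like this with transformers, bitsandbytes, and peft (model name and LoRA values are placeholders):

```python
# Sketch: 8-bit base model with a trainable LoRA adapter on top.
# Model name and LoRA hyperparameters are placeholders.
import torch
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    torch_dtype=torch.float16,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)   # stabilizes training on quantized weights

model = get_peft_model(model, LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
))
model.print_trainable_parameters()
```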
-
Python Lora discussion
Python Lora related posts
-
A Simple open-source Phone programmable with Arduino
-
Reticulum Is Unstoppable Networks for the People
-
Private, Secure and Uncensorable Messaging over a LoRa Mesh
-
Nomad, communicate off-grid mesh, forward secrecy and extreme privacy
-
Reticulum Network Stack β – cryptography-based networking stack
-
Now You Can Full Fine Tune / DreamBooth Stable Diffusion XL (SDXL) with only 10.3 GB VRAM via OneTrainer
-
You can now train a 70B language model at home
-
Index
What are some of the best open-source Lora projects in Python? This list will help you:
# | Project | Stars
---|---|---
1 | LLaMA-Factory | 35,732 |
2 | unsloth | 18,874 |
3 | Chinese-LLaMA-Alpaca | 18,466 |
4 | peft | 16,614 |
5 | LoRA | 10,890 |
6 | LongLoRA | 2,645 |
7 | xTuring | 2,618 |
8 | lorax | 2,235 |
9 | Reticulum | 2,116 |
10 | OneTrainer | 1,826 |
11 | NomadNet | 1,238 |
12 | aphrodite-engine | 1,159 |
13 | punica | 995 |
14 | LLM-Finetuning-Toolkit | 786 |
15 | Lora-for-Diffusers | 772 |
16 | DoRA | 658 |
17 | LLaMA-LoRA-Tuner | 448 |
18 | Sideband | 392 |
19 | BentoDiffusion | 340 |
20 | mLoRA | 280 |
21 | RNode_Firmware | 193 |
22 | kohya-sd-scripts-webui | 164 |
23 | LLaMA-8bit-LoRA | 147 |