peft VS minLoRA

Compare peft vs minLoRA and see how they differ.

peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. (by huggingface)

minLoRA

minLoRA: a minimal PyTorch library that allows you to apply LoRA to any PyTorch model. (by cccntu)
                    peft                 minLoRA
Mentions            26                   3
Stars               13,877               388
Growth              4.1%                 -
Activity            9.7                  2.4
Latest commit       4 days ago           11 months ago
Language            Python               Jupyter Notebook
License             Apache License 2.0   MIT License
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

peft

Posts with mentions or reviews of peft. We have used some of these posts to build our list of alternatives and similar projects. The most recent was on 2023-12-05.

minLoRA

Posts with mentions or reviews of minLoRA. We have used some of these posts to build our list of alternatives and similar projects. The most recent was on 2023-04-11.
  • [D] Is it possible to train the same LLM instance on different users' data?
    2 projects | /r/MachineLearning | 11 Apr 2023
    This repository seems to be doing it. Basically, you want to take the weights/biases trained during the LoRA training process and either include them in the compute graph for the larger network or remove them (a sketch of this merge/remove step appears after this list).
  • [P] minLoRA: An Easy-to-Use PyTorch Library for Applying LoRA to PyTorch Models
    3 projects | /r/MachineLearning | 21 Feb 2023
    Theirs requires you to rewrite the whole model and replace every layer you want to apply LoRA to with its LoRA counterpart, or use monkey-patching. Mine uses PyTorch parametrizations to inject the LoRA logic into existing models. If your model has nn.Linear, you can call add_lora(model) to add LoRA to all the linear layers. And it's not limited to Linear; you can see how I extended it to Embedding and Conv2d in a couple of lines of code (a minimal sketch of the parametrization approach follows below). https://github.com/cccntu/minLoRA/blob/main/minlora/model.py
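
To make the parametrization idea concrete, here is a minimal sketch of the technique the author describes, built on PyTorch's torch.nn.utils.parametrize module: a low-rank update B @ A is injected into each existing nn.Linear weight without replacing the layer. The names LoRAParametrization and add_lora_to_linear_layers are illustrative assumptions, not minLoRA's actual API; see the repository's model.py for the real implementation.

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class LoRAParametrization(nn.Module):
    """Computes W + (alpha / rank) * (B @ A) on the fly.

    Registered as a parametrization, this wraps the frozen weight W
    so the low-rank update joins the compute graph without the layer
    itself being replaced. (Illustrative sketch, not minLoRA's API.)
    """
    def __init__(self, fan_out, fan_in, rank=4, alpha=1.0):
        super().__init__()
        # Standard LoRA initialization: A random, B zero, so training
        # starts exactly at the pretrained weights.
        self.lora_A = nn.Parameter(torch.randn(rank, fan_in) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(fan_out, rank))
        self.scale = alpha / rank

    def forward(self, W):
        return W + self.scale * (self.lora_B @ self.lora_A)

def add_lora_to_linear_layers(model, rank=4):
    """Attach a LoRA parametrization to every nn.Linear weight."""
    for module in model.modules():
        if isinstance(module, nn.Linear):
            parametrize.register_parametrization(
                module,
                "weight",
                LoRAParametrization(module.out_features,
                                    module.in_features, rank),
            )

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))
add_lora_to_linear_layers(model)

# Train only the LoRA factors; the base weights stay frozen.
for name, param in model.named_parameters():
    param.requires_grad = "lora_" in name
```

This also clarifies the merge-or-remove step from the first post above: PyTorch's built-in parametrize.remove_parametrizations(module, "weight", leave_parametrized=True) bakes the update into the weight (W becomes W + scale * B @ A), while leave_parametrized=False discards the update and restores the original pretrained weight.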

What are some alternatives?

When comparing peft and minLoRA you can also consider the following projects:

lora - Using Low-rank adaptation to quickly fine-tune diffusion models.

GTSRB - Convolutional Neural Network for German Traffic Sign Recognition Benchmark

LoRA - Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"

alpaca-lora - Instruct-tune LLaMA on consumer hardware

dalai - The simplest way to run LLaMA on your local machine

mlc-llm - Enable everyone to develop, optimize and deploy AI models natively on everyone's devices.

lamini

simple-llm-finetuner - Simple UI for LLM Model Finetuning

alpaca_lora_4bit

transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

SwapCudaVersionWindows - How to swap/switch CUDA versions on Windows

PorousMediaLab - PorousMediaLab - toolbox for batch and 1D reactive transport modelling