[D] An ELI5 explanation for LoRA - Low-Rank Adaptation.

This page summarizes the projects mentioned and recommended in the original post on /r/MachineLearning

  • alpaca-lora

    Instruct-tune LLaMA on consumer hardware

  • Repos like https://github.com/tloen/alpaca-lora and https://github.com/Lightning-AI/lit-llama use LoRA as a method to fine-tune LLaMA models.

  • lora

    Using Low-rank adaptation to quickly fine-tune diffusion models. (by cloneofsimo)

  • Recently, I have seen the LoRA technique (Low-Rank Adaptation of Large Language Models) as a popular method for fine-tuning LLMs and other models.

  • lit-llama

    Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.

  • japanese-alpaca-lora

    A Japanese instruction-finetuned LLaMA

  • Hey! I am actually trying that task, with a little success. I am trying to adapt MPT by MosaicML for Japanese. Someone has done similar things with the LLaMA model (https://github.com/masa3141/japanese-alpaca-lora), but I want to try it with the MPT model; however, as you stated, the performance is not quite satisfactory. The LLaMA model trained on Japanese Alpaca seems to work fine, though!
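The core LoRA idea referenced throughout the list above can be sketched in a few lines: freeze the pretrained weight matrix W and learn only a low-rank update B @ A, so the number of trainable parameters shrinks dramatically. This is a minimal NumPy-only illustration of the math, not the actual implementation used by any of the repos mentioned; the dimensions, rank, and alpha value are arbitrary choices for demonstration.

```python
import numpy as np

# LoRA sketch: instead of updating the full weight W (d_out x d_in),
# freeze W and train a low-rank correction B @ A, where A is (r x d_in)
# and B is (d_out x r) with rank r << min(d_out, d_in).
rng = np.random.default_rng(0)
d_in, d_out, r = 512, 512, 8          # illustrative sizes; r is tiny

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable, small init
B = np.zeros((d_out, r))                    # trainable, zero init so the
                                            # update starts as a no-op
alpha = 16.0
scale = alpha / r

def lora_forward(x):
    # Base path uses the frozen weight; the LoRA path adds the
    # low-rank correction. Since B == 0 at init, the adapted model
    # initially matches the base model exactly.
    return x @ W.T + scale * (x @ A.T) @ B.T

x = rng.standard_normal((1, d_in))
assert np.allclose(lora_forward(x), x @ W.T)  # identical before training

# Parameter savings: full fine-tuning touches d_out * d_in weights,
# while LoRA trains only r * (d_in + d_out).
full_params = d_out * d_in          # 262144
lora_params = r * (d_in + d_out)    # 8192
```

This is why LoRA lets repos like alpaca-lora fine-tune LLaMA on consumer hardware: here only about 3% of the weights are trained, and in real models with much larger matrices the ratio is far smaller still.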

NOTE: The number of mentions on this list indicates mentions on common posts plus user suggested alternatives. Hence, a higher number means a more popular project.
