Finetune Falcon, LLaMA, MPT, and RedPajama on consumer hardware using PEFT LoRA
Why do you think https://github.com/bupticybee/FastLoRAChat is a good alternative to lora-instruct?