Introducing .NET Multi-platform App UI (MAUI)
.NET MAUI is the .NET Multi-platform App UI, a framework for building native device applications spanning mobile, tablet, and desktop.
-
LyCORIS
Lora beYond Conventional methods, Other Rank adaptation Implementations for Stable diffusion.
https://github.com/huggingface/peft
Microsoft has collided with existing names before — MauiKit and Maui Linux were already around: https://github.com/dotnet/maui/issues/35
It's unlikely they even consider checking whether they're stomping on existing names.
For those wondering why this is interesting: this technique is being used to reproduce[0] (though it's unclear with exactly what fidelity) the Alpaca results from Stanford[1] with a few hours of training on consumer-grade hardware.
I believe there will be a cottage industry of providing application-specific fine-tuned models like this, which can run very inexpensively in e.g. AWS. The barrier today seems to be that the base model (here, LLaMA) is encumbered and can't be used commercially. I'm confident someone will soon release, say, an MIT-licensed equivalent, and we'll all be off to the races.
[0] https://github.com/tloen/alpaca-lora
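The core mechanism that makes this cheap is easy to see in isolation: LoRA freezes the pretrained weight W and trains only a rank-r update BA. Here is a minimal numpy sketch of that idea (dimensions and the alpha/r scaling convention follow the common setup; this is an illustration, not the alpaca-lora or peft implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 64, 64, 4   # toy layer dimensions and LoRA rank (illustrative)
alpha = 8             # LoRA scaling hyperparameter

W = rng.standard_normal((d, k))         # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01  # trainable low-rank factor
B = np.zeros((d, r))                    # trainable, zero-init so the update starts at 0

def lora_forward(x):
    # Effective weight is W + (alpha/r) * B @ A; only A and B get gradients.
    delta = (alpha / r) * (B @ A)
    return x @ (W + delta).T

x = rng.standard_normal((1, k))
y = lora_forward(x)
# With B zero-initialized, the adapted layer is exactly the base layer.
assert np.allclose(y, x @ W.T)
```

Because B starts at zero, training begins from the unmodified base model, and after training the update can be merged back into W for inference at no extra cost.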
There are some WIP evolutions of SD LoRA in the works, like LoCon and LyCORIS.
https://github.com/KohakuBlueleaf/LyCORIS
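The gist of LoCon is extending the low-rank update from linear layers to convolution kernels. A rough numpy sketch of that idea (shapes are hypothetical and the flattening scheme is one common choice, not LyCORIS's actual code):

```python
import numpy as np

rng = np.random.default_rng(1)
co, ci, kh, kw, r = 128, 64, 3, 3, 4  # illustrative conv shape and rank

# Treat the (in_channels, kh, kw) axes as one flattened matrix dimension,
# then apply a LoRA-style rank-r factorization to the kernel update.
W = rng.standard_normal((co, ci, kh, kw))   # frozen conv kernel
A = rng.standard_normal((r, ci * kh * kw))  # trainable factor
B = np.zeros((co, r))                       # trainable, zero-init

delta = (B @ A).reshape(co, ci, kh, kw)     # rank-r kernel update
W_adapted = W + delta
# Zero-init B means the adapted kernel starts identical to the base kernel.
```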
Modern PEFT methods with LoRA actually do reduce training time by orders of magnitude.
Here's an example of 20 seconds per epoch on a single consumer GPU: https://github.com/johnsmith0031/alpaca_lora_4bit/issues/7#i...
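The "orders of magnitude" claim is easy to sanity-check with back-of-envelope arithmetic: for a d×k weight matrix, full fine-tuning updates d·k parameters while LoRA trains only r·(d+k). Using a 4096×4096 projection (the attention size in LLaMA-7B) and a typical rank of 8 — both illustrative choices, not numbers from the linked issue:

```python
d = k = 4096  # one attention projection in LLaMA-7B (illustrative)
r = 8         # a common LoRA rank choice

full_params = d * k        # full fine-tune: every weight gets a gradient
lora_params = r * (d + k)  # LoRA: only A (r x k) and B (d x r) are trained

print(f"full: {full_params:,}  lora: {lora_params:,}  "
      f"reduction: {full_params // lora_params}x")
```

A ~256x reduction in trainable parameters per layer, plus smaller optimizer state, is what makes 4-bit consumer-GPU runs like the one linked above feasible.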