Serving multiple LoRA-finetuned LLMs as one
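The idea behind this — the one punica builds on — is that many LoRA finetunes share the same dense base weights, so a batch of requests for different finetunes can share one base GEMM and each add only its own low-rank correction. A minimal NumPy sketch of that batched forward pass (illustrative only; all shapes and names here are assumptions, and punica itself fuses this into custom CUDA kernels):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank, batch = 8, 6, 2, 3

W = rng.normal(size=(d_in, d_out))            # shared base weight
adapters = [
    (rng.normal(size=(d_in, rank)),           # A_i for request i
     rng.normal(size=(rank, d_out)))          # B_i for request i
    for _ in range(batch)
]
x = rng.normal(size=(batch, d_in))            # one token per request

# Batched forward: one shared GEMM for the base model, plus a cheap
# per-request low-rank delta (the part a fused kernel would accelerate).
base = x @ W                                  # (batch, d_out), shared work
delta = np.stack([(x[i] @ A) @ B for i, (A, B) in enumerate(adapters)])
y = base + delta

# Reference: run each request through its own fully merged model.
y_ref = np.stack([x[i] @ (W + A @ B) for i, (A, B) in enumerate(adapters)])
print(np.allclose(y, y_ref))                  # → True: the two paths agree
```

Because `x[i] @ (W + A @ B)` distributes over the sum, the batched path is mathematically identical to serving each merged finetune separately, while the expensive `x @ W` is computed only once for the whole batch.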
Why do you think https://github.com/bupticybee/FastLoRAChat is a good alternative to punica?