Serving multiple LoRA-finetuned LLMs as one
Why do you think https://github.com/hiyouga/LLaMA-Factory is a good alternative to punica?