Hey folks. We just released a complete open-source solution for accelerating Stable Diffusion pretraining and fine-tuning. It reduces pretraining cost by 6.5x and the hardware cost of fine-tuning by 7x, while simultaneously speeding up both processes.
Open-source code: https://github.com/hpcaitech/ColossalAI/tree/main/examples/images/diffusion
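If you want a feel for the ColossalAI side before digging into the repo, here is a rough, self-contained sketch of the engine-style training loop from the ColossalAI releases contemporary with this post. The toy MLP, random tensors, and hyperparameters are our placeholders, not code from the diffusion example; the actual example drives training through PyTorch Lightning, so check the example README for the real entry point and configs.

    # Sketch of the ColossalAI engine-style training loop (old 0.x API).
    # The toy MLP and random data stand in for the diffusion UNet and image
    # dataset; run under torchrun on a GPU machine, e.g.:
    #   torchrun --nproc_per_node=1 sketch.py
    import colossalai
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    colossalai.launch_from_torch(config={})  # rank/world size come from torchrun env vars

    model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64))
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    criterion = nn.MSELoss()  # stands in for a DDPM-style noise-prediction loss

    # Random (noisy input, noise target) pairs as a stand-in dataset.
    dataset = TensorDataset(torch.randn(256, 64), torch.randn(256, 64))
    loader = DataLoader(dataset, batch_size=32)

    # colossalai.initialize wraps model/optimizer/criterion into an engine that
    # applies the parallelism and memory optimizations from the config.
    engine, train_loader, _, _ = colossalai.initialize(model, optimizer, criterion, loader)

    engine.train()
    for x, target in train_loader:
        x, target = x.cuda(), target.cuda()
        engine.zero_grad()
        loss = engine.criterion(engine(x), target)
        engine.backward(loss)
        engine.step()

The point of the engine wrapper is that the loop body stays plain PyTorch while the distributed and memory-saving machinery is swapped in behind it via the config.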
Our codebase for the diffusion models builds heavily on OpenAI's ADM codebase, lucidrains, Stable Diffusion, Lightning, and Hugging Face. Thanks to all of them for open-sourcing!
We also wrote a blog post about it: https://medium.com/@yangyou_berkeley/diffusion-pretraining-and-hardware-fine-tuning-can-be-almost-7x-cheaper-85e970fe207b
We'd be glad to hear your thoughts on our work!