ColossalAI
lightning
| | ColossalAI | lightning |
|---|---|---|
| Mentions | 41 | 50 |
| Stars | 37,775 | 2,754 |
| Growth | 3.5% | 0.7% |
| Activity | 9.7 | 9.9 |
| Latest commit | 2 days ago | about 12 hours ago |
| Language | Python | C |
| License | Apache License 2.0 | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
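The activity scale described above can be read as a percentile rank mapped onto 0–10, with recent commits weighted more heavily. A minimal Python sketch of that idea follows; the decay weighting and the percentile mapping are assumptions for illustration, not the site's actual formula:

```python
# Hypothetical reconstruction of a 0-10 "activity" score as described
# above: recent commits weigh more than older ones, and the final number
# is the project's percentile rank among tracked projects, scaled to 10.
# The decay factor and ranking scheme are assumptions, not the real formula.

def weighted_commits(weekly_counts, decay=0.9):
    # weekly_counts[0] is the most recent week; older weeks decay.
    return sum(c * decay**i for i, c in enumerate(weekly_counts))

def activity_scores(projects):
    """Map each project's weighted commit count to a 0-10 percentile score."""
    raw = {name: weighted_commits(weeks) for name, weeks in projects.items()}
    ranked = sorted(raw, key=raw.get)
    n = len(ranked)
    return {name: round(10 * (i + 1) / n, 1) for i, name in enumerate(ranked)}

scores = activity_scores({
    "quiet":  [0, 1, 0, 0],
    "steady": [3, 3, 3, 3],
    "busy":   [20, 15, 10, 5],
})
# Under this scheme, a score of 9.0+ means the project out-ranks
# roughly 90% of the tracked projects.
```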
ColossalAI
-
Open source solution replicates ChatGPT training process
The article briefly covers their RLHF implementation; there are more details on it here: https://github.com/hpcaitech/ColossalAI/blob/a619a190df71ea3...
-
An Open-Source Version of ChatGPT is Coming [News]
Need to deploy the inference model with Colossal AI.
-
Training dreambooth/embeddings on an RTX 3060 - possible?
It’s a framework with a lot of pipeline-parallelism optimizations, which can let you avoid fitting the whole model in VRAM. https://www.hpc-ai.tech/blog/diffusion-pretraining-and-hardware-fine-tuning-can-be-almost-7x-cheaper Tutorial here: https://github.com/hpcaitech/ColossalAI/blob/main/examples/images/dreambooth/README.md I have AMD cards so I haven’t tried this yet, but I’m thinking of converting my AMD GPU server over to NVIDIA because of this.
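The comment above describes pipeline parallelism at a high level: the model is split into stages so no single device ever holds all the weights. A minimal pure-Python sketch of the data flow (the stage functions are hypothetical stand-ins, not ColossalAI's API) might look like:

```python
# Minimal sketch of pipeline parallelism. Each "stage" holds only part
# of the model, so no single device needs the full set of weights in
# memory. The stage functions here are hypothetical stand-ins for real
# model partitions; a real framework also overlaps micro-batches so the
# stages run concurrently on different GPUs.

def stage1(x):
    return x * 2          # pretend: first half of the network

def stage2(x):
    return x + 1          # pretend: second half of the network

def pipeline(micro_batches, stages):
    """Run each micro-batch through the stages in sequence."""
    outputs = []
    for b in micro_batches:
        for stage in stages:
            b = stage(b)   # hand activations from one stage to the next
        outputs.append(b)
    return outputs

print(pipeline([1, 2, 3], [stage1, stage2]))  # [3, 5, 7]
```

The memory saving comes from each device materializing only its own stage's parameters and the activations crossing the stage boundary, rather than the whole model.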
-
A complete open-source solution for accelerating Stable Diffusion
Hey folks. We just released a complete open-source solution for accelerating Stable Diffusion pretraining and fine-tuning. It reduces the pretraining cost by 6.5 times and the hardware cost of fine-tuning by 7 times, while simultaneously speeding up both processes.
Open source address: https://github.com/hpcaitech/ColossalAI/tree/main/examples/images/diffusion
Our codebase for the diffusion models builds heavily on OpenAI's ADM codebase, lucidrains, Stable Diffusion, Lightning, and Hugging Face. Thanks for open-sourcing!
We also wrote a blog post about it: https://medium.com/@yangyou_berkeley/diffusion-pretraining-and-hardware-fine-tuning-can-be-almost-7x-cheaper-85e970fe207b
We'd be glad to hear your thoughts on our work!
-
We just released a complete open-source solution for accelerating Stable Diffusion pretraining and fine-tuning!
Open source address: https://github.com/hpcaitech/ColossalAI/tree/main/examples/images/diffusion
- Colossal-AI releases a complete open-source Stable Diffusion pretraining and fine-tuning solution that reduces the pretraining cost by 6.5 times and the hardware cost of fine-tuning by 7 times, while simultaneously speeding up both processes.
-
Colossal-AI Seamlessly Accelerates Large Models at Low Costs with Hugging Face
Project address: https://github.com/hpcaitech/ColossalAI
References:
- https://arxiv.org/abs/2202.05924v2
- https://arxiv.org/abs/2205.11487
- https://github.com/features/copilot
- https://github.com/huggingface/transformers
- https://www.forbes.com/sites/forbestechcouncil/2022/03/25/six-ai-trends-to-watch-in-2022/?sh=4dc51f82be15
- https://www.infoq.com/news/2022/06/meta-opt-175b/
-
The 10 Trending Python Repositories on GitHub (May 2022)
ColossalAI
lightning
-
Serious Question about Bitcoin
c-lightning, developed by Blockstream
-
Run a node
LND - https://github.com/lightningnetwork/lnd
CLN - https://github.com/ElementsProject/lightning
-
📑 MiniBolt resources 📚 List of the MiniBolt core/bonus guides + latest versions
Core Lightning (CLN) v.22.11 (Released 30th November 2022) - https://github.com/ElementsProject/lightning/releases
-
How to use the lightning network without apps.
A country can't stop you from running lnd or Core Lightning, so I'm not sure what you're talking about?
-
⚡ RaspiBolt Improvement Proposals (RBIPs) & Bounties 💰
Power cuts can lead to node corruption (e.g. here)
- Ask HN: What's the best source code you've read?
-
Core-lightning finally makes a step into the right direction
Source: https://github.com/ElementsProject/lightning/releases
-
An Amboss Space Odyssey | Thomas Jestopher
• Core Lightning 5078: https://github.com/ElementsProject/lightning/issues/5078
-
Building c-lightning plugin with .NET
Recently, I have been modifying my own Lapps to work as c-lightning plugins, in order to make them work not only with LND but also with c-lightning (What are Lapps?).
-
Best node Option for non raspberry Pi
Git clone c-lightning from GitHub, compile it, and install it (also read https://github.com/ElementsProject/lightning ).
What are some alternatives?
DeepSpeed - DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Megatron-LM - Ongoing research training transformer models at scale
lnd - Lightning Network Daemon ⚡️
determined - Determined is an open-source machine learning platform that simplifies distributed training, hyperparameter tuning, experiment tracking, and resource management. Works with PyTorch and TensorFlow.
Eclair - A scala implementation of the Lightning Network.
fairscale - PyTorch extensions for high performance and large scale training.
umbrel - A beautiful home server OS for self-hosting with an app store. Buy a pre-built Umbrel Home with umbrelOS, or install on a Raspberry Pi 4, Pi 5, any Ubuntu/Debian system, or a VPS.
DeepFaceLive - Real-time face swap for PC streaming or video calls
PaddlePaddle - PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (the "PaddlePaddle" core framework: high-performance standalone and distributed training and cross-platform deployment for deep learning and machine learning)
ivy - The Unified AI Framework
DeepFaceLab - DeepFaceLab is the leading software for creating deepfakes.
zeus - A mobile Bitcoin wallet fit for the gods. ⚡️ Est. 563345