snorkel vs pytorch-lightning
| | snorkel | pytorch-lightning |
|---|---|---|
| Mentions | 5 | 8 |
| Stars | 5,685 | 26,611 |
| Growth | 0.8% | 3.0% |
| Activity | 5.5 | 9.9 |
| Latest commit | 30 days ago | about 18 hours ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
snorkel
- [P] We are building a curated list of open source tooling for data-centric AI workflows, looking for contributions.
The paid product came out of an open source tool: https://github.com/snorkel-team/snorkel
- [Discussion] - "data sourcing will be more important than model building in the era of foundational model fine-tuning"
- Can't use load_data from utils
Actually, I referenced it in my issue as well. There seem to be different utils.py files in different folders under the snorkel-tutorials repo, but the utils module we get after importing snorkel points to a different [file](https://github.com/snorkel-team/snorkel/blob/master/snorkel/utils/core.py), i.e. the utils file in the main snorkel repo is not the same one the tutorials use.
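For context, a minimal sketch of the distinction described in that post (the tutorial-local helper name comes from the post itself; its exact signature varies by tutorial folder and is assumed here):

```python
# The tutorials' utils.py lives inside each snorkel-tutorials folder and is NOT
# part of the installed snorkel package, so this import only works when run from
# (or alongside a copy of) that tutorial folder:
from utils import load_data  # tutorial-local helper; name/signature vary per tutorial

# The installed package exposes different helpers under snorkel.utils
# (defined in snorkel/utils/core.py), for example:
from snorkel.utils import probs_to_preds  # convert probabilistic labels to hard labels
```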
- [D] A hand-picked selection of the best Python ML Libraries of 2021
pytorch-lightning
- How to get started with artificial intelligence?
https://see.stanford.edu/Course/CS229 https://lightning.ai/ https://www.youtube.com/watch?v=00s9ireCnCw&t=57s https://towardsdatascience.com/
- Best practice for saving logits/activation values of model in PyTorch Lightning
I've been wondering what the recommended method of saving logits/activations is with PyTorch Lightning. I've looked at Callbacks, Loggers and ModelHooks, but none of their use cases seem to cover this kind of activity (even if I were to create my own custom variants of each utility). The ModelCheckpoint Callback makes me feel like a custom Callback would be the way to go, but I'm not quite sure. This closed GitHub issue does address my issue to some extent.
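A minimal sketch of the custom-Callback route the post leans towards (the class name and the assumption that validation_step returns raw logits are illustrative, not from the thread):

```python
import torch
from pytorch_lightning import Callback


class SaveLogitsCallback(Callback):
    """Collect the logits returned by validation_step and write them to disk."""

    def __init__(self, out_path="val_logits.pt"):
        self.out_path = out_path
        self._logits = []

    def on_validation_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, dataloader_idx=0):
        # `outputs` is whatever the LightningModule's validation_step returned;
        # adjust this line if it returns a dict rather than a raw tensor.
        self._logits.append(outputs.detach().cpu())

    def on_validation_epoch_end(self, trainer, pl_module):
        torch.save(torch.cat(self._logits), self.out_path)
        self._logits.clear()


# Usage: trainer = pytorch_lightning.Trainer(callbacks=[SaveLogitsCallback()])
```

Like ModelCheckpoint, this keeps the saving logic out of the LightningModule itself.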
- We just released a complete open-source solution for accelerating Stable Diffusion pretraining and fine-tuning!
Our codebase for the diffusion models builds heavily on OpenAI's ADM codebase, lucidrains, Stable Diffusion, Lightning and Hugging Face. Thanks for open-sourcing!
- An elegant and strong PyTorch Trainer
For lightweight use, pytorch-lightning is too heavy, and its source code is very difficult for beginners to read, at least for me.
What are some alternatives?
lnd - Lightning Network Daemon ⚡️
skweak - skweak: A software toolkit for weak supervision applied to NLP tasks
argilla - Argilla is a collaboration platform for AI engineers and domain experts that require high-quality outputs, full data ownership, and overall efficiency.
Eclair - A scala implementation of the Lightning Network.
mmdetection - OpenMMLab Detection Toolbox and Benchmark
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
composer - Supercharge Your Model Training
umbrel - A beautiful home server OS for self-hosting with an app store. Buy a pre-built Umbrel Home with umbrelOS, or install on a Raspberry Pi 4, Pi 5, any Ubuntu/Debian system, or a VPS.
weasel - Weakly Supervised End-to-End Learning (NeurIPS 2021)
Keras - Deep Learning for humans
fastai - The fastai deep learning library
RTL - Ride The Lightning - A full function web browser app for LND, C-Lightning and Eclair