pytorch-seq2seq vs Deep-Learning-Papers-Reading-Roadmap
| | pytorch-seq2seq | Deep-Learning-Papers-Reading-Roadmap |
|---|---|---|
| Mentions | 3 | 5 |
| Stars | 5,169 | 37,120 |
| Growth | - | - |
| Activity | 5.4 | 0.0 |
| Latest commit | 3 months ago | over 1 year ago |
| Language | Jupyter Notebook | Python |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pytorch-seq2seq
A Good Github Repo to Look at (CS388 Natural Language Processing)
I don't know how many people are taking CS388 NLP in Fall 2022, but the assignments are really putting a lot of stress on me. I was searching for good materials to prepare for the NLP class, and a really good resource to look at is this GitHub repo: https://github.com/bentrevett/pytorch-seq2seq.
- [D] How to truly understand attention mechanism in transformers?
- [D] Resources for Understanding The Original Transformer Paper
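The posts above ask how to truly understand the attention mechanism. A minimal sketch may help make the idea concrete: the scaled dot-product attention from "Attention Is All You Need", written here in plain NumPy for illustration — this is not code from the pytorch-seq2seq repo, just the core formula softmax(QKᵀ/√d_k)·V.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k).
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)   # (..., seq_q, seq_k)
    weights = softmax(scores, axis=-1)               # each row sums to 1
    # Output is a weighted average of the values.
    return weights @ v, weights

# Toy example: one 2-d query attending over three keys/values.
q = np.array([[1.0, 0.0]])
k = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
v = np.array([[10.0], [20.0], [30.0]])
out, w = scaled_dot_product_attention(q, k, v)
# The query is most similar to the first key, so the first value
# gets the largest attention weight.
```

The repo's tutorials build this same idea up gradually, from plain encoder-decoder models to attention and transformers, in PyTorch rather than NumPy.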
Deep-Learning-Papers-Reading-Roadmap
[D] Resources for Understanding The Original Transformer Paper
https://github.com/floodsung/Deep-Learning-Papers-Reading-Roadmap - This one is a bit dated, so it doesn't contain all of the papers you need to read to get up to date, but I think you should definitely read all of the papers in this list and implement as much as you can.
4 ML Roadmaps to Help You Find Useful Resources to Learn From
Deep Learning Papers Reading Roadmap
Should I implement every famous DL paper? [D]
I found a really great list of introductory and popular DL papers (github.com/floodsung/Deep-Learning-Papers-Reading-Roadmap) and I would absolutely implement every paper on this list if I had the time (at least a mini version, e.g. CIFAR10 instead of ImageNet). Is it essential for me to implement every single paper on that list to become a good DL researcher and to start reading/implementing more recent ones? All the papers on the list are from before 2017 and I can't wait to start exploring the latest research! Would I be able to get away with just implementing a handful of papers from that list?
[D] How did you implement papers with models that required a lot of GPUs to train?
I'm self-learning ML and trying to implement the papers listed here, but I don't have access to hundreds of free GPUs like those corpos do.
Looking for Beginner CV Resources
Definitely check out this list: https://github.com/floodsung/Deep-Learning-Papers-Reading-Roadmap. It's all papers; you should get used to reading scientific material.
What are some alternatives?
Time-Series-Forecasting-Using-LSTM - Time-Series Forecasting on Stock Prices using LSTM
faceswap - Deepfakes Software For All
tensor2tensor - Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Real-Time-Voice-Cloning - Clone a voice in 5 seconds to generate arbitrary speech in real-time
poolformer - PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
Keras - Deep Learning for humans
Behavior-Sequence-Transformer-Pytorch - This is a pytorch implementation for the BST model from Alibaba https://arxiv.org/pdf/1905.06874.pdf
ru-dalle - Generate images from texts. In Russian
Seq2seq-PyTorch
sequitur - Library of autoencoders for sequential data
seq2seq - Attention-based sequence to sequence learning