| | machine-learning-experiments | Fast-Transformer |
|---|---|---|
| Mentions | 8 | 4 |
| Stars | 1,608 | 146 |
| Growth | - | - |
| Activity | 2.6 | 0.0 |
| Latest Commit | 4 months ago | about 2 years ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
What are some alternatives?
- Torrent-To-Google-Drive-Downloader-v3 - Simple notebook to stream torrent files to Google Drive using Google Colab and Python 3.
- reformer-pytorch - Reformer, the efficient Transformer, in PyTorch.
- osumapper - An automatic beatmap generator using TensorFlow / deep learning.
- Perceiver - Implementation of Perceiver, General Perception with Iterative Attention, in TensorFlow.
- lama - 🦙 LaMa image inpainting: Resolution-robust Large Mask Inpainting with Fourier Convolutions, WACV 2022.
- Conformer - An implementation of Conformer (Convolution-augmented Transformer for Speech Recognition), a Transformer variant, in TensorFlow/Keras.
- PConv-Keras - Unofficial implementation of "Image Inpainting for Irregular Holes Using Partial Convolutions". Try it at: www.fixmyphoto.ai
- TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification.
- lucid - A collection of infrastructure and tools for research in neural network interpretability.
- Transformer-in-Transformer - An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches.
- Hands-On-Meta-Learning-With-Python - Learning to learn using one-shot learning, MAML, Reptile, Meta-SGD, and more, with TensorFlow.
- embedding-encoder - A scikit-learn-compatible transformer that turns categorical variables into dense entity embeddings.