Fast-Transformer vs language-planner

| | Fast-Transformer | language-planner |
|---|---|---|
| Mentions | 4 | 1 |
| Stars | 146 | 213 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Last commit | about 2 years ago | almost 2 years ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
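To make the idea of recency weighting concrete, here is a hypothetical sketch of such a score (not the site's actual formula, which is unpublished): each commit contributes an amount that decays exponentially with its age, so recent commits have higher weight than older ones. The function name, half-life parameter, and decay shape are all illustrative assumptions.

```python
from datetime import datetime, timezone

def activity_score(commit_dates, half_life_days=30.0, now=None):
    """Hypothetical recency-weighted activity score.

    Each commit contributes 0.5 ** (age_in_days / half_life_days),
    so a commit made today counts 1.0 and a commit one half-life
    old counts 0.5. This is an illustration of the weighting idea,
    not the actual metric used by the comparison site.
    """
    now = now or datetime.now(timezone.utc)
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)
    return score

# Example: three commits; the most recent one dominates the score.
now = datetime(2024, 1, 1, tzinfo=timezone.utc)
commits = [
    datetime(2023, 12, 31, tzinfo=timezone.utc),  # 1 day old
    datetime(2023, 12, 1, tzinfo=timezone.utc),   # 31 days old
    datetime(2023, 6, 1, tzinfo=timezone.utc),    # 214 days old
]
print(round(activity_score(commits, now=now), 3))
```

With a 30-day half-life, the day-old commit contributes nearly 1.0, the month-old commit about 0.49, and the seven-month-old commit almost nothing, which is the qualitative behavior described above.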
What are some alternatives?
- reformer-pytorch - Reformer, the efficient Transformer, in PyTorch
- gpt-3-simple-tutorial - Generate SQL from natural-language sentences using OpenAI's GPT-3 model
- Perceiver - Implementation of Perceiver, General Perception with Iterative Attention, in TensorFlow
- augmented-interpretable-models - Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible.
- Conformer - An implementation of Conformer: Convolution-augmented Transformer for Speech Recognition, a Transformer variant, in TensorFlow/Keras
- alpaca_eval - An automatic evaluator for instruction-following language models. Human-validated, high-quality, cheap, and fast.
- machine-learning-experiments - 🤖 Interactive machine learning experiments: 🏋️ model training + 🎨 model demos
- tldr-transformers - The "tl;dr" on a few notable transformer papers (pre-2022)
- TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
- BLOOM-fine-tuning - Fine-tune BLOOM
- Transformer-in-Transformer - An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
- mgpt - Multilingual generative pretrained model