| | language-planner | Fast-Transformer |
|---|---|---|
| Mentions | 1 | 4 |
| Stars | 227 | 146 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Latest Commit | about 2 years ago | over 2 years ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 means a project is among the top 10% of the most actively developed projects we track.
What are some alternatives?
- gpt-3-simple-tutorial - Generate SQL from natural-language sentences using OpenAI's GPT-3 model
- reformer-pytorch - Reformer, the efficient Transformer, in PyTorch
- augmented-interpretable-models - Interpretable and efficient predictors using pre-trained language models; scikit-learn compatible
- Perceiver - Implementation of Perceiver, General Perception with Iterative Attention, in TensorFlow
- alpaca_eval - An automatic evaluator for instruction-following language models; human-validated, high-quality, cheap, and fast
- Conformer - An implementation of Conformer (Convolution-augmented Transformer for Speech Recognition), a Transformer variant, in TensorFlow/Keras
- tldr-transformers - The "tl;dr" on a few notable Transformer papers (pre-2022)
- machine-learning-experiments - 🤖 Interactive machine learning experiments: 🏋️ model training + 🎨 model demos
- BLOOM-fine-tuning - Fine-tune BLOOM
- TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
- mgpt - Multilingual generative pretrained model
- Transformer-in-Transformer - An implementation of Transformer-in-Transformer in TensorFlow for image classification, with attention inside local patches