almost 2 years ago
Jupyter Notebook
Apache License 2.0
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
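The description above says only that recent commits carry more weight than older ones; the site's exact formula is not given. A minimal sketch of one common way to implement such a recency-weighted score, using exponential decay with an assumed half-life (the function name, half-life value, and percentile mapping are illustrative, not the tracker's actual method):

```python
import math
from datetime import datetime, timedelta

def activity_score(commit_dates, now=None, half_life_days=90.0):
    """Hypothetical recency-weighted activity score.

    Each commit contributes a weight that halves every `half_life_days`,
    so recent commits dominate the total. This is a sketch of the idea
    described in the text, not the site's actual formula.
    """
    now = now or datetime.now()
    decay = math.log(2) / half_life_days  # per-day decay rate
    return sum(math.exp(-decay * (now - d).days) for d in commit_dates)

# Example: three commits of increasing age.
now = datetime(2024, 1, 1)
commits = [now - timedelta(days=d) for d in (1, 30, 365)]
score = activity_score(commits, now=now)
# The 1-day-old commit contributes ~0.99, the 30-day-old ~0.79,
# and the year-old commit only ~0.06.
```

A score like this could then be converted to a 0-10 scale by percentile rank across all tracked projects, which would match the "top 10%" interpretation given above.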
We haven't tracked posts mentioning Fast-Transformer yet.
Tracking mentions began in Dec 2020.
What are some alternatives?
reformer-pytorch - Reformer, the efficient Transformer, in PyTorch
Perceiver - Implementation of Perceiver, General Perception with Iterative Attention in TensorFlow
poolformer - PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
Conformer - An implementation of Conformer: Convolution-augmented Transformer for Speech Recognition, a Transformer variant, in TensorFlow/Keras
LongNet - Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
AvatarGAN - Generate Cartoon Images using Generative Adversarial Network
TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
machine-learning-experiments - 🤖 Interactive Machine Learning experiments: 🏋️ model training + 🎨 model demos
swarms - Build, Deploy, and Scale Reliable Swarms of Autonomous Agents. Join our Community: https://discord.gg/DbjBMJTSWD
embedding-encoder - Scikit-Learn compatible transformer that turns categorical variables into dense entity embeddings.
ML-Workspace - 🛠 All-in-one web-based IDE specialized for machine learning and data science.
planckforth - Bootstrapping a Forth interpreter from hand-written tiny ELF binary. Just for fun.