EfficientFormer
EfficientFormerV2 [ICCV 2023] & EfficientFormer [NeurIPS 2022] (by snap-research)
dytox
DyTox: Transformers for Continual Learning with Dynamic Token Expansion, accepted at CVPR 2022 (by arthurdouillard)
| | EfficientFormer | dytox |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 944 | 132 |
| Star growth | 0.7% | - |
| Activity | 3.3 | 1.8 |
| Last commit | 9 months ago | almost 2 years ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
The number of mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
EfficientFormer
Posts with mentions or reviews of EfficientFormer. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-09-16.
- A look at Apple’s new Transformer-powered predictive text model

  I'm pretty fatigued on constantly providing references and sources in this thread, but here is an example of what they've made publicly available: https://github.com/snap-research/EfficientFormer
- Snap and Northeastern University Researchers Propose EfficientFormer: A Vision Transformer That Runs as Fast as MobileNet While Maintaining High Performance

  Check out the paper, github
dytox
Posts with mentions or reviews of dytox. We have used some of these posts to build our list of alternatives and similar projects.
- [D] Using special tokens for a domain-specific language in transformers

  Code for https://arxiv.org/abs/2111.11326 found at: https://github.com/arthurdouillard/dytox
What are some alternatives?
When comparing EfficientFormer and dytox you can also consider the following projects:
- PyTorch-Model-Compare - Compare neural networks by their feature similarity
- CeiT - Implementation of Convolutional enhanced image Transformer
- Efficient-AI-Backbones - Efficient AI Backbones including GhostNet, TNT and MLP, developed by Huawei Noah's Ark Lab
- ml-cvnets - CVNets: A library for training computer vision networks
- predictive-spy - Spying on Apple’s new predictive text model
- llama.cpp - LLM inference in C/C++
- transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX