noah-research
Noah Research (by huawei-noah)
transformer-in-transformer
Implementation of Transformer in Transformer, pixel-level attention paired with patch-level attention for image classification, in PyTorch (by lucidrains)
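To make the "pixel-level attention paired with patch-level attention" idea concrete, here is an illustrative sketch of one Transformer-in-Transformer block, not the lucidrains implementation itself: an inner attention layer operates over the pixel tokens inside each patch, the pixel information is folded back into the corresponding patch token, and an outer attention layer then operates across patch tokens. All dimensions, class names, and the simplified block structure are assumptions for illustration.

```python
import torch
import torch.nn as nn

class TNTBlockSketch(nn.Module):
    """Simplified sketch of a Transformer-in-Transformer block (hypothetical)."""
    def __init__(self, patch_dim=64, pixel_dim=16, pixels_per_patch=16, heads=4):
        super().__init__()
        # inner (pixel-level) attention, applied within each patch
        self.inner_attn = nn.MultiheadAttention(pixel_dim, heads, batch_first=True)
        # projects the flattened pixel tokens of a patch into the patch token space
        self.pixel_to_patch = nn.Linear(pixel_dim * pixels_per_patch, patch_dim)
        # outer (patch-level) attention, applied across patches
        self.outer_attn = nn.MultiheadAttention(patch_dim, heads, batch_first=True)

    def forward(self, pixel_tokens, patch_tokens):
        # pixel_tokens: (batch * num_patches, pixels_per_patch, pixel_dim)
        # patch_tokens: (batch, num_patches, patch_dim)
        inner_out, _ = self.inner_attn(pixel_tokens, pixel_tokens, pixel_tokens)
        pixel_tokens = pixel_tokens + inner_out
        b, n, _ = patch_tokens.shape
        # fold pixel information into the corresponding patch tokens
        folded = self.pixel_to_patch(pixel_tokens.flatten(1)).view(b, n, -1)
        patch_tokens = patch_tokens + folded
        outer_out, _ = self.outer_attn(patch_tokens, patch_tokens, patch_tokens)
        return pixel_tokens, patch_tokens + outer_out

block = TNTBlockSketch()
pixels = torch.randn(2 * 9, 16, 16)   # 2 images, 9 patches, 16 pixel tokens each
patches = torch.randn(2, 9, 64)
pix_out, patch_out = block(pixels, patches)
print(patch_out.shape)  # torch.Size([2, 9, 64])
```

The key design point this sketch captures is that pixel-level detail is preserved in its own token stream while still informing the coarser patch-level stream each block.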
| | noah-research | transformer-in-transformer |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 835 | 291 |
| Growth | 0.7% | - |
| Activity | 0.0 | 1.8 |
| Last commit | 5 months ago | over 2 years ago |
| Language | Python | Python |
| License | - | MIT License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
noah-research
Posts with mentions or reviews of noah-research.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2021-03-04.
transformer-in-transformer
Posts with mentions or reviews of transformer-in-transformer.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2021-03-04.
“Transformer in Transformer” paper explained!
A third-party implementation of "Transformer in Transformer": https://github.com/lucidrains/transformer-in-transformer
What are some alternatives?
When comparing noah-research and transformer-in-transformer you can also consider the following projects:
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in PyTorch
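Since the performer-pytorch alternative is described as a linear attention-based transformer, a brief sketch of what "linear attention" means may help the comparison. The version below uses the simple elu(x)+1 feature map rather than Performer's FAVOR+ random-feature approximation of softmax; the function name and shapes are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v):
    """Linear attention sketch: O(n) in sequence length instead of O(n^2).

    Uses the elu(x)+1 feature map (an assumption; Performer itself uses
    FAVOR+ random features to approximate softmax attention).
    q, k, v: (batch, seq_len, dim)
    """
    q = F.elu(q) + 1  # positive feature map phi(q)
    k = F.elu(k) + 1  # positive feature map phi(k)
    # summarize keys/values into a (dim, dim) matrix: cost linear in seq_len
    kv = torch.einsum('bnd,bne->bde', k, v)
    # per-query normalizer, shape (batch, seq_len, 1)
    z = 1.0 / (q @ k.sum(dim=1).unsqueeze(-1))
    return torch.einsum('bnd,bde->bne', q, kv) * z

q = k = v = torch.randn(2, 128, 32)
out = linear_attention(q, k, v)
print(out.shape)  # torch.Size([2, 128, 32])
```

Because keys and values are reduced to a fixed-size summary before queries touch them, attention cost grows linearly with sequence length, which is the main appeal of this family of models.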