Transformer-in-Transformer
swarms
| | Transformer-in-Transformer | swarms |
|---|---|---|
| Mentions | 4 | 1 |
| Stars | 41 | 650 |
| Growth | - | - |
| Activity | 0.0 | 10.0 |
| Last commit | about 2 years ago | 7 days ago |
| Language | Jupyter Notebook | Python |
| License | Apache License 2.0 | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Transformer-in-Transformer
- I Implemented Transformer in Transformer
Hacker News top posts: Dec 6, 2021
I Implemented Transformer in Transformer (5 comments)
- [P] I implemented Transformer in Transformer
swarms
What are some alternatives?
poolformer - PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
langchain-course - Learn to build and deploy AI apps.
LongNet - Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
LLM-Prompt-Library - Advanced Code and Text Manipulation Prompts for Various LLMs. Suitable for GPT-4, Claude, Llama2, Falcon, Bard, and other high-performing open-source LLMs.
AvatarGAN - Generate Cartoon Images using Generative Adversarial Network
langchain-tutorials - Overview and tutorial of the LangChain Library
principia - The Principia Rewrite
vision_models_playground - Playground for testing and implementing various Vision Models
planckforth - Bootstrapping a Forth interpreter from hand-written tiny ELF binary. Just for fun.
agentchain - Chain together LLMs for reasoning and orchestrate multiple large models to accomplish complex tasks
Fast-Transformer - An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer Variant in TensorFlow