D2L_Attention_Mechanisms_in_TF vs Transformer-in-Transformer

Compare D2L_Attention_Mechanisms_in_TF and Transformer-in-Transformer and see what their differences are.

D2L_Attention_Mechanisms_in_TF

This repository contains TensorFlow 2 code for the Attention Mechanisms chapter of the Dive into Deep Learning (D2L) book. (by biswajitsahoo1111)
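As a flavor of what that chapter implements, here is a minimal scaled dot-product attention sketch in TensorFlow 2; the function and variable names are ours for illustration, not the repository's:

```python
import tensorflow as tf

def scaled_dot_product_attention(queries, keys, values):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = tf.cast(tf.shape(queries)[-1], tf.float32)
    scores = tf.matmul(queries, keys, transpose_b=True) / tf.sqrt(d)
    weights = tf.nn.softmax(scores, axis=-1)  # one distribution per query token
    return tf.matmul(weights, values)

# Toy usage: batch of 2 sequences, 4 tokens, 8-dim embeddings.
q = tf.random.normal((2, 4, 8))
k = tf.random.normal((2, 4, 8))
v = tf.random.normal((2, 4, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4, 8)
```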

Transformer-in-Transformer

An implementation of Transformer in Transformer (TNT) in TensorFlow for image classification, which applies attention inside local patches (by Rishit-dagli)
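The distinctive step in TNT is an inner Transformer that attends over the pixel-level tokens of each patch, separately from the outer attention across patches. Below is a rough TensorFlow sketch of that inner step; the shapes and names are illustrative assumptions, not the repository's API:

```python
import tensorflow as tf

# Assumed shapes: 4 images, 16 patches per image, 16 pixel tokens per patch,
# 32-dim embeddings. The inner attention mixes tokens within a patch only.
x = tf.random.normal((4, 16, 16, 32))                 # (batch, patches, tokens, dim)
batch, patches, tokens, dim = x.shape
inner_attention = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=8)

flat = tf.reshape(x, (batch * patches, tokens, dim))  # fold patches into the batch axis
flat = inner_attention(flat, flat)                    # self-attention inside each patch
x = tf.reshape(flat, (batch, patches, tokens, dim))   # restore the original layout
```

Folding the patch axis into the batch axis is what keeps the attention local: tokens from different patches never appear in the same sequence, so they cannot attend to each other.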
                 D2L_Attention_Mechanisms_in_TF    Transformer-in-Transformer
Mentions         7                                 4
Stars            12                                41
Growth           -                                 -
Activity         0.0                               0.0
Last commit      over 2 years ago                  over 2 years ago
Language         Jupyter Notebook                  Jupyter Notebook
License          -                                 Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.

Transformer-in-Transformer

Posts with mentions or reviews of Transformer-in-Transformer. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-12-06.

What are some alternatives?

When comparing D2L_Attention_Mechanisms_in_TF and Transformer-in-Transformer, you can also consider the following projects:

NLP-With-PyTorch - My NLP experiments using PyTorch to solve some common NLP problems with advanced and state-of-the-art deep learning techniques.

poolformer - PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)

TokenCut - (CVPR 2022) PyTorch implementation of "Self-supervised transformers for unsupervised object discovery using normalized cut"

LongNet - Implementation of plug-and-play attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"

pytorch-GAT - My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!

AvatarGAN - Generate Cartoon Images using Generative Adversarial Network

gpt-mini - Yet another minimalistic TensorFlow (re-)re-implementation of Karpathy's PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer).

swarms - Orchestrate swarms of agents from any framework, such as OpenAI and LangChain, for real-world workflow automation. Join our community: https://discord.gg/DbjBMJTSWD

principia - The Principia Rewrite

planckforth - Bootstrapping a Forth interpreter from hand-written tiny ELF binary. Just for fun.

Fast-Transformer - An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer variant in TensorFlow (a sketch of the additive-attention idea appears after this list)

ML-Workspace - 🛠 All-in-one web-based IDE specialized for machine learning and data science.
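Regarding the Fast-Transformer entry above: the core of Fastformer's additive attention is collapsing the whole token sequence into a single global vector with learned softmax weights, rather than computing pairwise token-to-token scores. Here is a minimal TensorFlow sketch of that pooling step; the names and shapes are illustrative assumptions, not Fast-Transformer's actual API:

```python
import tensorflow as tf

def additive_pool(x, w):
    """Collapse a (batch, tokens, dim) sequence into one (batch, dim) vector
    using softmax-normalized additive attention scores."""
    scores = tf.nn.softmax(tf.einsum('btd,d->bt', x, w), axis=-1)  # one score per token
    return tf.einsum('bt,btd->bd', scores, x)                      # weighted sum of tokens

# Toy usage: batch of 2 sequences, 4 tokens, 8-dim embeddings.
dim = 8
queries = tf.random.normal((2, 4, dim))
keys = tf.random.normal((2, 4, dim))
w_q = tf.Variable(tf.random.normal((dim,)))     # learned query-scoring vector

global_query = additive_pool(queries, w_q)      # (2, 8): one global query per sequence
interactions = keys * global_query[:, None, :]  # global query modulates every key
```

Because the pooling is a single weighted sum, its cost is linear in sequence length, which is what the paper's title alludes to.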