attention-is-all-you-need-pytorch VS transformer-pytorch

Compare attention-is-all-you-need-pytorch vs transformer-pytorch and see how they differ.

attention-is-all-you-need-pytorch

A PyTorch implementation of the Transformer model in "Attention is All You Need". (by jadore801120)

transformer-pytorch

Transformer: PyTorch Implementation of "Attention Is All You Need" (by hyunwoongko)
                 attention-is-all-you-need-pytorch   transformer-pytorch
Mentions         3                                   2
Stars            8,409                               2,106
Growth           -                                   -
Activity         0.0                                 2.1
Latest commit    7 months ago                        2 days ago
Language         Python                              Python
License          MIT License                         -
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

attention-is-all-you-need-pytorch

Posts with mentions or reviews of attention-is-all-you-need-pytorch. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-10-10.
  • ElevenLabs Launches Voice Translation Tool to Break Down Language Barriers
    2 projects | news.ycombinator.com | 10 Oct 2023
    The transformer model was invented to attend to context over the entire sequence length. Look at how the original authors used the Transformer for NMT in the original Vaswani et al. publication (a minimal attention sketch follows this list). https://github.com/jadore801120/attention-is-all-you-need-py...
  • Question: LLMs
    1 project | /r/learnmachinelearning | 6 Jul 2023
    I did implement an "LLM" proof of concept from scratch in a course for my master's, pretty much doing a small implementation of a Transformer from the "Attention Is All You Need" paper (plus other resources). It was useless, but it was a great experience for understanding how it works. There are a few implementations like this out there, such as https://github.com/jadore801120/attention-is-all-you-need-pytorch (first Google result). I think it is a fun exercise (the amount of fun depends on how much of a masochist you are :) ).
  • Lack of activation in transformer feedforward layer?
    2 projects | /r/learnmachinelearning | 20 May 2021
    I'm curious why the second matrix multiplication, unlike the first, is not followed by an activation. Is there any particular reason why a non-linearity would be trivial or even avoided in the second operation? For reference, variations of this can be seen in a number of different implementations, including BERT-pytorch and attention-is-all-you-need-pytorch. (The feed-forward sketch after this list shows the block in question.)
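The first post above refers to the Transformer's ability to attend to the entire sequence at once. As a rough illustration (not code from either repository), here is a minimal scaled dot-product attention in PyTorch, following the paper's softmax(QK^T / sqrt(d_k)) V formula; the function name and tensor shapes are illustrative:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """softmax(QK^T / sqrt(d_k)) V -- every query position can
    attend to every key position, so the model sees context over
    the whole sequence in a single step."""
    d_k = q.size(-1)
    # (..., len_q, d_k) @ (..., d_k, len_k) -> (..., len_q, len_k)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # Masked positions (mask == 0) get -inf so softmax zeroes them out.
        scores = scores.masked_fill(mask == 0, float('-inf'))
    attn = F.softmax(scores, dim=-1)
    return attn @ v
```

With batched inputs of shape (batch, seq_len, d_k), for example `scaled_dot_product_attention(torch.randn(2, 10, 64), torch.randn(2, 10, 64), torch.randn(2, 10, 64))`, each of the 10 output positions is a weighted mix of all 10 value positions.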
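The last post asks why the position-wise feed-forward block applies no activation after its second projection. For reference, a minimal sketch of that block as the paper defines it, FFN(x) = max(0, xW1 + b1)W2 + b2; the class name is illustrative, and the default dimensions are the paper's base configuration (d_model=512, d_ff=2048):

```python
import torch
import torch.nn as nn

class PositionwiseFeedForward(nn.Module):
    """FFN(x) = max(0, x W1 + b1) W2 + b2 -- the ReLU sits only
    between the two projections; the second one has no activation,
    which is the pattern the post above observes."""
    def __init__(self, d_model=512, d_ff=2048):
        super().__init__()
        self.w_1 = nn.Linear(d_model, d_ff)   # expand to the inner dimension
        self.w_2 = nn.Linear(d_ff, d_model)   # project back, no activation
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.w_2(self.relu(self.w_1(x)))

# Usage: applied independently at each position of a sequence.
ffn = PositionwiseFeedForward()
out = ffn(torch.randn(2, 10, 512))  # -> (2, 10, 512)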

transformer-pytorch

Posts with mentions or reviews of transformer-pytorch. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-01-03.

What are some alternatives?

When comparing attention-is-all-you-need-pytorch and transformer-pytorch you can also consider the following projects:

LFattNet - Attention-based View Selection Networks for Light-field Disparity Estimation

transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

long-range-arena - Long Range Arena for Benchmarking Efficient Transformers

LaTeX-OCR - pix2tex: Using a ViT to convert images of equations into LaTeX code.

BERT-pytorch - Google AI 2018 BERT pytorch implementation

bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)

OpenPrompt - An Open-Source Framework for Prompt-Learning.

allennlp - An open-source NLP research library, built on PyTorch.

how_attentive_are_gats - Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)

minGPT - A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training