OpenPrompt vs attention-is-all-you-need-pytorch
| | OpenPrompt | attention-is-all-you-need-pytorch |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 4,146 | 8,432 |
| Growth | 2.0% | - |
| Activity | 4.4 | 0.0 |
| Latest commit | 3 months ago | 10 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Posts mentioning attention-is-all-you-need-pytorch
ElevenLabs Launches Voice Translation Tool to Break Down Language Barriers
The transformer model was invented to attend to context over the entire sequence length. Look at how the original authors used the Transformer for NMT in the Vaswani et al. publication. https://github.com/jadore801120/attention-is-all-you-need-py...
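To make that idea concrete, a minimal single-head self-attention in PyTorch might look like the sketch below. This is not code from the linked repo; the function name, projection shapes, and toy dimensions are assumptions chosen for illustration, and it omits multi-head splitting, masking, and residual connections.

```python
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention (illustrative sketch).

    x: (seq_len, d_model) -- every position attends to every other position.
    w_q, w_k, w_v: (d_model, d_k) projection matrices (hypothetical names).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v       # project inputs to queries/keys/values
    scores = q @ k.T / k.shape[-1] ** 0.5     # (seq_len, seq_len) pairwise similarities
    weights = F.softmax(scores, dim=-1)       # attention distribution over the whole sequence
    return weights @ v                        # context-aware mixture of values

# Toy usage: 10 positions, model width 16, key width 8 (arbitrary sizes).
x = torch.randn(10, 16)
w_q, w_k, w_v = (torch.randn(16, 8) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # torch.Size([10, 8])
```

The (seq_len, seq_len) score matrix is why each output position can draw on the entire input sequence rather than a fixed local window.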
Question: LLMs
I did implement an "LLM" proof of concept from scratch in a course for my master's, pretty much a small implementation of a transformer from the Attention Is All You Need paper (plus other resources). It was useless, but it was a great experience for understanding how it works. There are a few implementations like this out there, such as this one: https://github.com/jadore801120/attention-is-all-you-need-pytorch (first Google result). I think it is a fun exercise (the amount of fun depends on how much of a masochist you are :) ).
Lack of activation in transformer feedforward layer?
I'm curious as to why the second matrix multiplication is not followed by an activation, unlike the first one. Is there any particular reason why a non-linearity would be unnecessary, or even avoided, in the second operation? For reference, variations of this can be seen in a number of different implementations, including BERT-pytorch and attention-is-all-you-need-pytorch.
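For reference, the position-wise feed-forward block in the paper is FFN(x) = max(0, xW1 + b1)W2 + b2, i.e. a single ReLU between two linear maps and nothing after the second. A rough PyTorch sketch of how such implementations typically structure it (a paraphrase, not code copied from either repo; the class name and dropout placement are assumptions) is:

```python
import torch
import torch.nn as nn

class PositionwiseFeedForward(nn.Module):
    """FFN(x) = max(0, x W1 + b1) W2 + b2 -- only one non-linearity."""

    def __init__(self, d_model=512, d_ff=2048, dropout=0.1):
        super().__init__()
        self.w_1 = nn.Linear(d_model, d_ff)   # expand to the inner dimension
        self.w_2 = nn.Linear(d_ff, d_model)   # project back; no activation follows
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        # ReLU only after the first linear layer; the second stays linear,
        # returning an unconstrained d_model-sized vector for the residual
        # connection and layer norm that follow it.
        return self.w_2(self.dropout(torch.relu(self.w_1(x))))

# Toy usage with the paper's default sizes.
ffn = PositionwiseFeedForward()
print(ffn(torch.randn(2, 10, 512)).shape)  # torch.Size([2, 10, 512])
```

One common reading is that the second projection is kept linear because its output feeds straight into a residual addition and layer normalization, so a second non-linearity there would mainly restrict the range of the residual update.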
What are some alternatives?
autonlp - 🤗 AutoNLP: train state-of-the-art natural language processing models and deploy them in a scalable environment automatically
LFattNet - Attention-based View Selection Networks for Light-field Disparity Estimation
clip-as-service - 🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
long-range-arena - Long Range Arena for Benchmarking Efficient Transformers
camel_tools - A suite of Arabic natural language processing tools developed by the CAMeL Lab at New York University Abu Dhabi.
BERT-pytorch - Google AI 2018 BERT pytorch implementation
nlp-recipes - Natural Language Processing Best Practices & Examples
allennlp - An open-source NLP research library, built on PyTorch.
thinc - 🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
transformer-pytorch - Transformer: PyTorch Implementation of "Attention Is All You Need"
chappie.ai - Generalized AI to perform a multitude of tasks written in python3
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.