text-to-text-transfer-transformer

| | text-to-text-transfer-tra | text-to-text-transfer-transformer |
|---|---|---|
| Mentions | 2 | 29 |
| Stars | - | 5,925 |
| Stars growth (MoM) | - | 1.4% |
| Activity | - | 5.0 |
| Latest commit | - | 4 months ago |
| Language | Python | - |
| License | - | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
text-to-text-transfer-tra
- AlphaCode by DeepMind
-
New text-to-image network from Google beats DALL-E
T5 was open-sourced on release (up to 11B params): https://github.com/google-research/text-to-text-transfer-tra...
It is also available via Hugging Face transformers.
Unclear if 11B is the T5-XXL mentioned in the paper, however.
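For reference, a minimal sketch of loading the released checkpoints through the Hugging Face transformers wrappers. The Hub ids "t5-small" through "t5-11b" are the standard ones; whether "t5-11b" corresponds to the paper's T5-XXL naming is, as noted above, unclear.

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Hub checkpoint ids mirror the released sizes: "t5-small", "t5-base",
# "t5-large", "t5-3b", "t5-11b". t5-small keeps this demo lightweight.
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 frames every task as text-to-text; the task is selected by a plain prefix.
ids = tokenizer("translate English to German: The house is wonderful.",
                return_tensors="pt").input_ids
out = model.generate(ids, max_new_tokens=40)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```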
text-to-text-transfer-transformer
- T5: Text-to-Text-Transfer-Transformer
-
Gemma: New Open Models
Google released the T5 paper about 5 years ago:
https://arxiv.org/abs/1910.10683
The release included full model weights along with a detailed description of the dataset, the training process, and the ablations that led to that architecture. T5 was state-of-the-art on many benchmarks when it was released, but it was of course quickly eclipsed by GPT-3.
Following GPT-3, it became much more common for labs to withhold full details and model weights. Prior to that, it was common practice at Google (BERT, T5), Meta (BART), OpenAI (GPT-1, GPT-2) and others to release full training details and model weights.
-
[P] Free and Fast LLM Finetuning
[2] - https://arxiv.org/abs/1910.10683
- Free and Fast LLM Finetuning
-
[Discussion] Is there a better way than positional encodings in self attention?
T5-style relative encodings https://arxiv.org/abs/1910.10683
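For concreteness, here is a condensed sketch of the bucketing scheme behind T5-style relative encodings: instead of adding position vectors to the input, each attention logit gets a learned per-head scalar bias indexed by a bucketed relative offset. The defaults (32 buckets, max distance 128) follow the paper; this function is an adaptation for illustration, not the exact library code.

```python
import math
import torch
import torch.nn as nn

def relative_position_bucket(relative_position, num_buckets=32, max_distance=128):
    """Map signed offsets (key_pos - query_pos) to bucket ids, encoder-style:
    half the buckets per direction, exact buckets for small offsets,
    log-spaced buckets out to max_distance, one shared bucket beyond it."""
    num_buckets //= 2
    buckets = (relative_position > 0).long() * num_buckets
    pos = relative_position.abs()
    max_exact = num_buckets // 2
    is_small = pos < max_exact
    if_large = max_exact + (
        torch.log(pos.float().clamp(min=1) / max_exact)
        / math.log(max_distance / max_exact)
        * (num_buckets - max_exact)
    ).long()
    if_large = torch.minimum(if_large, torch.full_like(if_large, num_buckets - 1))
    return buckets + torch.where(is_small, pos, if_large)

# A (num_buckets, num_heads) embedding table maps bucket ids to per-head biases
# that are added to the attention logits before the softmax; T5 computes this
# once in the first layer and shares it across layers.
num_heads, num_buckets, seq_len = 8, 32, 10
bias_table = nn.Embedding(num_buckets, num_heads)
rel = torch.arange(seq_len)[None, :] - torch.arange(seq_len)[:, None]  # (q, k)
bias = bias_table(relative_position_bucket(rel)).permute(2, 0, 1)      # (heads, q, k)
```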
-
What were the 40 research papers on the list Ilya Sutskever gave John Carmack?
11. T5: "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" (2020) - https://arxiv.org/abs/1910.10683 (Google Research)
-
[P] T5 Implementation in PyTorch
You can find a link to the paper here: https://arxiv.org/abs/1910.10683
-
Text-to-Text Transformer (T5-Base Model) Testing For Summarization, Sentiment Classification, and Translation Using Pytorch and Torchtext
The Text-to-Text Transformer (T5) is a neural network architecture that is particularly well-suited for natural language processing tasks involving the generation of text. It builds on the Transformer architecture introduced in "Attention Is All You Need" by Vaswani et al., was itself introduced in "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" by Raffel et al., and has since become a popular choice for many NLP tasks, including language translation, summarization, and text generation.
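As a sketch of what that testing looks like in practice: one t5-base checkpoint handles all three tasks, selected purely by the input prefix ("summarize:", "sst2 sentence:" and "translate English to French:" are prefixes the T5 paper trained with). The original post builds its pipeline with torchtext; this condensed version uses the Hugging Face wrappers instead.

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Placeholder standing in for a real news article.
article = ("T5 frames summarization, classification and translation as the "
           "same problem: feed text in, read text out.")

prompts = [
    "summarize: " + article,                        # abstractive summary
    "sst2 sentence: The acting was dreadful.",      # sentiment -> "negative"
    "translate English to French: Good morning.",   # translation
]
for prompt in prompts:
    ids = tokenizer(prompt, return_tensors="pt", truncation=True).input_ids
    out = model.generate(ids, max_new_tokens=60)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```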
- AlphaCode by DeepMind
-
[R] LiBai: a large-scale open-source model training toolbox
Found relevant code at https://github.com/google-research/text-to-text-transfer-transformer, plus other code implementations.
What are some alternatives?
tortoise-tts - A multi-voice TTS system trained with an emphasis on quality
fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
imagen-pytorch - Implementation of Imagen, Google's Text-to-Image Neural Network, in Pytorch
majesty-diffusion - Majesty Diffusion by @Dango233(@Dango233max) and @apolinario (@multimodalart)
DeepCreamPy - Decensoring Hentai with Deep Neural Networks
DeepCreamPy - deeppomf's DeepCreamPy + some updates
dalle-mini - DALL·E Mini - Generate images from a text prompt
hent-AI - Automation of censor bar detection
latent-diffusion - High-Resolution Image Synthesis with Latent Diffusion Models
DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch