| | text-to-text-transfer-transformer | ml-engineering |
|---|---|---|
| Mentions | 29 | 9 |
| Stars | 5,909 | 9,753 |
| Growth | 1.1% | - |
| Activity | 5.0 | 9.7 |
| Last Commit | 3 months ago | 11 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Creative Commons Attribution Share Alike 4.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
text-to-text-transfer-transformer
- T5: Text-to-Text-Transfer-Transformer
-
Gemma: New Open Models
Google released the T5 paper about 5 years ago:
https://arxiv.org/abs/1910.10683
This included full model weights along with a detailed description of the dataset, training process, and ablations that led them to that architecture. T5 was state-of-the-art on many benchmarks when it was released, but it was of course quickly eclipsed by GPT-3.
Following GPT-3, it became much more common for labs not to release full details or model weights. Before that, it was common practice for Google (BERT, T5), Meta (BART), OpenAI (GPT-1, GPT-2), and others to release full training details and model weights.
-
[P] Free and Fast LLM Finetuning
[2] - https://arxiv.org/abs/1910.10683
- Free and Fast LLM Finetuning
-
[Discussion] Is there a better way than positional encodings in self attention?
T5-style relative encodings https://arxiv.org/abs/1910.10683
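The linked paper replaces absolute positional encodings with a learned scalar bias per relative-distance bucket, added directly to the attention logits. Below is a minimal PyTorch sketch of that bucketing idea; the function and variable names are mine, and the math follows the simplified bidirectional form of what the T5 codebase does rather than being a drop-in copy.

```python
import math
import torch

def relative_position_bucket(relative_position, num_buckets=32, max_distance=128):
    """Map signed key-minus-query offsets to bucket ids (bidirectional case)."""
    num_buckets //= 2  # half the buckets for each sign of the offset
    ret = (relative_position > 0).long() * num_buckets
    n = relative_position.abs()
    max_exact = num_buckets // 2  # nearby offsets each get their own bucket
    is_small = n < max_exact
    # Distant offsets share logarithmically sized buckets up to max_distance.
    val_if_large = max_exact + (
        torch.log(n.float().clamp(min=1) / max_exact)
        / math.log(max_distance / max_exact)
        * (num_buckets - max_exact)
    ).long()
    val_if_large = torch.minimum(
        val_if_large, torch.full_like(val_if_large, num_buckets - 1)
    )
    return ret + torch.where(is_small, n, val_if_large)

# Each attention head learns one bias per bucket; the (heads, q, k) bias
# tensor is added to the QK^T logits before the softmax.
seq_len, num_heads, num_buckets = 8, 4, 32
rel_pos = torch.arange(seq_len)[None, :] - torch.arange(seq_len)[:, None]
bias_table = torch.nn.Embedding(num_buckets, num_heads)
bias = bias_table(relative_position_bucket(rel_pos)).permute(2, 0, 1)
```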
-
What were the 40 research papers on the list Ilya Sutskever gave John Carmack?
11. T5: "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" (2020) - https://arxiv.org/abs/1910.10683 (Google Research)
-
[P] T5 Implementation in PyTorch
You can find a link to the paper here: https://arxiv.org/abs/1910.10683
-
Text-to-Text Transformer (T5-Base Model) Testing For Summarization, Sentiment Classification, and Translation Using PyTorch and Torchtext
The Text-to-Text Transformer is a neural network architecture that is particularly well suited for natural language processing tasks involving text generation. It was introduced in the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" by Raffel et al., building on the Transformer architecture from "Attention is All You Need" by Vaswani et al., and has since become a popular choice for many NLP tasks, including language translation, summarization, and text generation.
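As a quick way to try the three tasks from the title, here is a hedged sketch using the Hugging Face transformers API rather than the torchtext pipeline the post describes; the task prefixes are the ones from the T5 paper.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# T5 selects the task purely via a text prefix on the input.
prompts = [
    "summarize: The tower is 324 metres tall, about the same height as "
    "an 81-storey building, and the tallest structure in Paris.",
    "sst2 sentence: This movie was absolutely wonderful.",
    "translate English to German: The house is wonderful.",
]
for prompt in prompts:
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```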
- AlphaCode by DeepMind
-
[R] LiBai: a large-scale open-source model training toolbox
Found relevant code at https://github.com/google-research/text-to-text-transfer-transformer
ml-engineering
- Accelerators
-
Gemma: New Open Models
There is also a lot of work going into opening up the actual infrastructure and lower-level management of large fleets of GPUs/TPUs - my team focuses on making that infrastructure layer more approachable on GKE and Kubernetes.
https://github.com/GoogleCloudPlatform/ai-on-gke/tree/main
and
https://github.com/google/xpk (a bit more focused on HPC, but includes AI)
and
https://github.com/stas00/ml-engineering (not associated with GKE, but describes training with SLURM)
The actual training is still done by a fairly small pool of very experienced people, but it's getting better. And serving models gets faster every day - you can often simply build on Triton and TensorRT-LLM or vLLM and see significant wins month to month.
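To make the serving point concrete, here is a minimal vLLM offline-inference sketch; the model name is just vLLM's small example checkpoint, so treat it as a placeholder for whatever you actually serve.

```python
from vllm import LLM, SamplingParams

# vLLM's offline batch-inference entry point; swap in your own checkpoint.
llm = LLM(model="facebook/opt-125m")
params = SamplingParams(temperature=0.8, max_tokens=64)

outputs = llm.generate(["The future of ML infrastructure is"], params)
for out in outputs:
    print(out.outputs[0].text)
```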
- FLaNK Stack 29 Jan 2024
-
ML Engineering Online Book
OK, the pdf is ready now: https://github.com/stas00/ml-engineering#pdf-version
-
Self train a super tiny model recommendations
this might be interesting: https://github.com/stas00/ml-engineering/blob/master/transformers/make-tiny-models.md
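The gist of that doc is shrinking a model's config so a randomly initialized copy can stand in for the real thing in tests. A hedged sketch of the idea (the sizes are arbitrary, and the actual guide also shrinks the tokenizer/vocab):

```python
from transformers import AutoTokenizer, GPT2Config, GPT2LMHeadModel

# Tiny, randomly initialized GPT-2 variant: loads in seconds, fine for CI.
config = GPT2Config(n_embd=32, n_layer=2, n_head=2, n_positions=128)
tiny = GPT2LMHeadModel(config)

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # reuse the full vocab here
tiny.save_pretrained("tiny-gpt2")
tokenizer.save_pretrained("tiny-gpt2")
```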
- The AI Battlefield Engineering – What You Need to Know
- Machine Learning Engineering Guides and Tools
What are some alternatives?
fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
slurm-mail - Slurm-Mail is a drop-in replacement for Slurm's e-mails to give users much more information about their jobs compared to the standard Slurm e-mails.
tortoise-tts - A multi-voice TTS system trained with an emphasis on quality
peft - 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
DeepCreamPy - Decensoring Hentai with Deep Neural Networks
deeplake - Database for AI. Store Vectors, Images, Texts, Videos, etc. Use with LLMs/LangChain. Store, query, version, & visualize any AI data. Stream data in real-time to PyTorch/TensorFlow. https://activeloop.ai
dalle-mini - DALL·E Mini - Generate images from a text prompt
pinferencia - Python + Inference - Model Deployment library in Python. Simplest model inference server ever.
latent-diffusion - High-Resolution Image Synthesis with Latent Diffusion Models
haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
majesty-diffusion - Majesty Diffusion by @Dango233(@Dango233max) and @apolinario (@multimodalart)
AtomGPT - A Chinese-English pretrained large model, aiming to match ChatGPT-level capability