| | simpleT5 | TencentPretrain |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 381 | 981 |
| Growth | - | 0.9% |
| Activity | 2.5 | 7.6 |
| Last commit | 12 months ago | 9 days ago |
| Language | Python | Python |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
simpleT5
-
Transformers: How to compare performance to the base model?
Currently I just took ~42,000 samples and trained a translation task directly on codeT5 with https://github.com/Shivanandroy/simpleT5. The validation loss and, at least qualitatively, the results are not too bad. I'm now going to compare it to the base codeT5 model with the *.loss function as suggested above.
-
[P] SimpleT5 : Train T5 models in just 3 lines of code
🌟 GitHub: https://github.com/Shivanandroy/simpleT5
🌟 Medium: https://snrspeaks.medium.com/simplet5-train-t5-models-in-just-3-lines-of-code-by-shivanand-roy-2021-354df5ae46ba
🌟 Colab Notebook: https://colab.research.google.com/drive/1JZ8v9L0w0Ai3WbibTeuvYlytn0uHMP6O?usp=sharing
TencentPretrain
What are some alternatives?
reformer-pytorch - Reformer, the efficient Transformer, in Pytorch
tiger - Open Source LLM toolkit to build trustworthy LLM applications. TigerArmor (AI safety), TigerRAG (embedding, RAG), TigerTune (fine-tuning)
datatap-python - Focus on Algorithm Design, Not on Data Wrangling
alpaca-lora - Instruct-tune LLaMA on consumer hardware
ModelZoo.pytorch - Hands-on ImageNet training. Unofficial ModelZoo project in PyTorch. MobileNetV3 Top1 75.64🌟 GhostNet1.3x 75.78🌟
LLM-Adapters - Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
frame-semantic-transformer - Frame Semantic Parser based on T5 and FrameNet
llama-classification - Text classification with Foundation Language Model LLaMA
KeyPhraseTransformer - KeyPhraseTransformer lets you quickly extract key phrases, topics, and themes from your text data with a T5 transformer | Keyphrase extraction | Keyword extraction
stanford_alpaca - Code and documentation to train Stanford's Alpaca models, and generate the data.