| | LLM-Adapters | adapters |
|---|---|---|
| Mentions | 2 | 4 |
| Stars | 960 | 2,414 |
| Growth | 4.1% | 2.6% |
| Activity | 7.3 | 8.6 |
| Last Commit | 2 months ago | 3 days ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
LLM-Adapters
- Google DeepMind CEO Says Some Form of AGI Possible in a Few Years
  That is not true; you can, for example, use an additional adapter to optimize, which takes about $50 and an hour. https://github.com/AGI-Edgerunners/LLM-Adapters
- LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of LLMs
adapters
- [D] NLP question: does fine-tuning train input embedding?
  Usually with computer-vision ResNets, people fine-tune only the last layers, but in NLP you tune the entire model. There are plenty of cases where people avoid full fine-tuning, however, such as with adapters.
- [P] AdapterHub v2: Lightweight Transfer Learning with Transformers and Adapters
  GitHub: https://github.com/Adapter-Hub/adapter-transformers
- Our new state-of-the-art multilingual NLP Toolkit - Trankit has been released
  Thanks for the question. The main libraries Trankit uses are PyTorch and adapter-transformers. As for the GPU requirement, we have tested the toolkit in different scenarios and found that a single GPU with 4 GB of memory is enough for comfortable use.
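The adapter idea discussed in these threads (freeze the base model, train only small bottleneck layers inserted between its existing layers) can be sketched in a few lines. This is a minimal NumPy illustration of a Houlsby-style bottleneck adapter, not code from either repository; the dimensions and initialization scheme are illustrative assumptions.

```python
import numpy as np

def bottleneck_adapter(hidden, W_down, W_up):
    """Bottleneck adapter forward pass: down-project, nonlinearity,
    up-project, then a residual connection back to the input."""
    z = np.maximum(hidden @ W_down, 0.0)  # ReLU here; GELU is also common
    return hidden + z @ W_up              # residual preserves base behavior

d_model, d_bottleneck = 768, 16           # illustrative sizes
rng = np.random.default_rng(0)
W_down = rng.normal(scale=0.02, size=(d_model, d_bottleneck))
W_up = np.zeros((d_bottleneck, d_model))  # zero-init: adapter starts as identity

hidden = rng.normal(size=(4, d_model))    # a batch of 4 token vectors
out = bottleneck_adapter(hidden, W_down, W_up)

# Trainable parameters per adapter vs. one full d_model x d_model layer:
adapter_params = W_down.size + W_up.size  # 2 * 768 * 16 = 24,576
full_layer_params = d_model * d_model     # 768 * 768   = 589,824
```

Because only `W_down` and `W_up` are trained while the base model stays frozen, the trainable parameter count drops by more than an order of magnitude per layer, which is what makes the cheap fine-tuning runs mentioned above plausible.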
What are some alternatives?
TencentPretrain - Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
discus - A data-centric AI package for ML/AI. Get the best high-quality data for the best results. Discord: https://discord.gg/t6ADqBKrdZ
clip-as-service - 🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
custom-diffusion - Custom Diffusion: Multi-Concept Customization of Text-to-Image Diffusion (CVPR 2023)
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
hierarchical-domain-adaptation - Code of NAACL 2022 "Efficient Hierarchical Domain Adaptation for Pretrained Language Models" paper.
JointBERT - PyTorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
AGIEval
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
LLM-Finetuning-Hub - Toolkit for fine-tuning, ablating and unit-testing open-source LLMs. [Moved to: https://github.com/georgian-io/LLM-Finetuning-Toolkit]
trankit - Trankit is a Light-Weight Transformer-based Python Toolkit for Multilingual Natural Language Processing