| | adapters | LLM-Adapters |
|---|---|---|
| Mentions | 4 | 2 |
| Stars | 2,398 | 950 |
| Growth | 1.9% | 2.2% |
| Activity | 8.6 | 7.3 |
| Last commit | 4 days ago | about 2 months ago |
| Language | Jupyter Notebook | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
adapters
- [D] NLP question: does fine-tuning train input embeddings?
Usually in computer vision with ResNets, people fine-tune only the last layers, but in NLP you tune the entire model. There are also plenty of cases where people try to avoid this, however, such as adapters.
- [P] AdapterHub v2: Lightweight Transfer Learning with Transformers and Adapters
GitHub: https://github.com/Adapter-Hub/adapter-transformers
- Our new state-of-the-art multilingual NLP toolkit, Trankit, has been released
Thanks for the question. The main libraries Trankit uses are PyTorch and adapter-transformers. As for GPU requirements, we have tested the toolkit in different scenarios and found that a single GPU with 4 GB of memory is enough for comfortable use.
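The posts above contrast full fine-tuning with the adapter approach: freeze the pretrained weights and train only small bottleneck modules inserted between layers. A minimal sketch of that idea in plain PyTorch follows; the module and class names here are illustrative, not the API of the adapters library.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 16):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

class AdaptedBlock(nn.Module):
    """A frozen stand-in for a pretrained layer, followed by a trainable adapter."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.pretrained = nn.Linear(hidden_dim, hidden_dim)
        self.pretrained.requires_grad_(False)  # freeze the base weights
        self.adapter = Adapter(hidden_dim)     # only this part receives gradients

    def forward(self, x):
        return self.adapter(self.pretrained(x))

block = AdaptedBlock(hidden_dim=64)
trainable = sum(p.numel() for p in block.parameters() if p.requires_grad)
total = sum(p.numel() for p in block.parameters())
print(f"trainable: {trainable} / {total}")  # trainable: 2128 / 6288
```

Because only the bottleneck parameters are updated, the optimizer state and checkpoint per task stay small while the shared pretrained weights are reused across tasks.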
LLM-Adapters
- Google DeepMind CEO Says Some Form of AGI Possible in a Few Years
That is not true; you can, for example, use an additional adapter to optimize, which takes $50 and about an hour. https://github.com/AGI-Edgerunners/LLM-Adapters
- LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of LLMs
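One of the parameter-efficient methods covered by the LLM-Adapters family is the LoRA-style low-rank update: the frozen weight is augmented with a trainable product of two small matrices, so only r·(d_in + d_out) parameters are updated per layer. The sketch below is a hedged illustration in plain PyTorch under that assumption, not code from the LLM-Adapters repository.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank correction B @ A."""
    def __init__(self, in_dim: int, out_dim: int, rank: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim)
        self.base.requires_grad_(False)                    # pretrained weight stays frozen
        self.A = nn.Parameter(torch.randn(rank, in_dim) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_dim, rank))  # zero init: no change at start
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(128, 128, rank=4)
x = torch.randn(2, 128)
# B is zero-initialized, so the LoRA path contributes nothing before training
assert torch.allclose(layer(x), layer.base(x))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)  # 1024, versus 16,512 for the full linear layer
```

With rank 4 on a 128×128 layer, the trainable parameter count drops from 16,512 to 1,024, which is why a cheap single-GPU run (as the comment above suggests) becomes feasible.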
What are some alternatives?
haystack - 🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
TencentPretrain - Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
clip-as-service - 🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
discus - A data-centric AI package for ML/AI. Get the best high-quality data for the best results. Discord: https://discord.gg/t6ADqBKrdZ
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
custom-diffusion - Custom Diffusion: Multi-Concept Customization of Text-to-Image Diffusion (CVPR 2023)
JointBERT - Pytorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
hierarchical-domain-adaptation - Code of NAACL 2022 "Efficient Hierarchical Domain Adaptation for Pretrained Language Models" paper.
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
AGIEval
trankit - Trankit is a Light-Weight Transformer-based Python Toolkit for Multilingual Natural Language Processing
LLM-Finetuning-Hub - Toolkit for fine-tuning, ablating and unit-testing open-source LLMs. [Moved to: https://github.com/georgian-io/LLM-Finetuning-Toolkit]