hierarchical-domain-adaptation VS LLM-Adapters

Compare hierarchical-domain-adaptation and LLM-Adapters to see how they differ.

hierarchical-domain-adaptation

Code of NAACL 2022 "Efficient Hierarchical Domain Adaptation for Pretrained Language Models" paper. (by alexandra-chron)

LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models" (by AGI-Edgerunners)
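Both projects build on the same core idea: inserting small trainable "adapter" modules into a frozen pretrained model so that only a tiny fraction of parameters is updated per domain or task. A minimal numpy sketch of a bottleneck adapter (the classic down-project / nonlinearity / up-project / residual pattern) might look like the following; all dimensions and initializations here are hypothetical, chosen for illustration, and do not reflect either repository's actual code.

```python
import numpy as np

def bottleneck_adapter(x, w_down, w_up):
    """Bottleneck adapter: down-project, ReLU, up-project, residual add.

    x:      (batch, d_model) hidden states from the frozen backbone
    w_down: (d_model, d_bottleneck) trainable down-projection
    w_up:   (d_bottleneck, d_model) trainable up-projection
    """
    h = np.maximum(0.0, x @ w_down)   # nonlinearity in the low-rank bottleneck
    return x + h @ w_up               # residual connection around the adapter

# Hypothetical sizes for illustration only.
d_model, d_bottleneck, batch = 16, 4, 2
rng = np.random.default_rng(0)
x = rng.standard_normal((batch, d_model))
w_down = rng.standard_normal((d_model, d_bottleneck)) * 0.01
w_up = np.zeros((d_bottleneck, d_model))  # zero-init up-projection: adapter starts as identity

out = bottleneck_adapter(x, w_down, w_up)
assert out.shape == x.shape
assert np.allclose(out, x)  # with w_up = 0 the adapter is a no-op at initialization
```

The zero-initialized up-projection is a common trick so that training starts from the unmodified pretrained model; only `w_down` and `w_up` would be updated during fine-tuning, while the backbone stays frozen.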
                 hierarchical-domain-adaptation   LLM-Adapters
Mentions         1                                2
Stars            32                               950
Growth           -                                3.1%
Activity         3.0                              7.3
Last commit      8 months ago                     2 months ago
Language         Python                           Python
License          -                                Apache License 2.0
The number of mentions indicates the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

hierarchical-domain-adaptation

Posts with mentions or reviews of hierarchical-domain-adaptation. We have used some of these posts to build our list of alternatives and similar projects.

LLM-Adapters

Posts with mentions or reviews of LLM-Adapters. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-05-03.

What are some alternatives?

When comparing hierarchical-domain-adaptation and LLM-Adapters you can also consider the following projects:

pytorch-adapt - Domain adaptation made easy. Fully featured, modular, and customizable.

TencentPretrain - Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo

pykale - Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem. ⭐ Star to support our work!

discus - A data-centric AI package for ML/AI. Get the best high-quality data for the best results. Discord: https://discord.gg/t6ADqBKrdZ

custom-diffusion - Custom Diffusion: Multi-Concept Customization of Text-to-Image Diffusion (CVPR 2023)

AGIEval

adapters - A Unified Library for Parameter-Efficient and Modular Transfer Learning

LLM-Finetuning-Hub - Toolkit for fine-tuning, ablating and unit-testing open-source LLMs. [Moved to: https://github.com/georgian-io/LLM-Finetuning-Toolkit]

trankit - Trankit is a Light-Weight Transformer-based Python Toolkit for Multilingual Natural Language Processing

finetuner - 🎯 Task-oriented embedding tuning for BERT, CLIP, etc.

VL_adapter - PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language Tasks" (CVPR2022)

xTuring - Build, customize and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6