pykale VS hierarchical-domain-adaptation

Compare pykale vs hierarchical-domain-adaptation and see how they differ.

pykale

Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem. ⭐ Star to support our work! (by pykale)

hierarchical-domain-adaptation

Code of NAACL 2022 "Efficient Hierarchical Domain Adaptation for Pretrained Language Models" paper. (by alexandra-chron)
                 pykale              hierarchical-domain-adaptation
Mentions         2                   1
Stars            427                 32
Growth           1.6%                -
Activity         9.1                 3.0
Latest commit    about 1 month ago   7 months ago
Language         Python              Python
License          MIT License         -
Mentions - the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects we track.

pykale

Posts with mentions or reviews of pykale. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-04-25.

hierarchical-domain-adaptation

Posts with mentions or reviews of hierarchical-domain-adaptation. We have used some of these posts to build our list of alternatives and similar projects.

What are some alternatives?

When comparing pykale and hierarchical-domain-adaptation you can also consider the following projects:

EasyOCR - Ready-to-use OCR with 80+ supported languages and all popular writing scripts, including Latin, Chinese, Arabic, Devanagari, Cyrillic, and more.

pytorch-adapt - Domain adaptation made easy. Fully featured, modular, and customizable.

AdaTime - [TKDD 2023] AdaTime: A Benchmarking Suite for Domain Adaptation on Time Series Data

LLM-Adapters - Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"

Multimodal-Toolkit - Multimodal model for text and tabular data, with HuggingFace transformers as the building block for text data

Meta-SelfLearning - Meta Self-learning for Multi-Source Domain Adaptation: A Benchmark

social-balance - A library-agnostic project for calculating exactly and efficiently social balance, based on the Aref, Mason and Wilson paper (https://arxiv.org/abs/1611.09030)

open_flamingo - An open-source framework for training large multimodal models.

jina - ☁️ Build multimodal AI applications with cloud-native stack

valhalla-nmt - Code repository for CVPR 2022 paper "VALHALLA: Visual Hallucination for Machine Translation"