Adapters Alternatives
Similar projects and alternatives to adapters
-
haystack
LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search, or conversational agent chatbots.
-
JointBERT
PyTorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
-
trankit
Trankit is a Light-Weight Transformer-based Python Toolkit for Multilingual Natural Language Processing
-
siamese-nn-semantic-text-similarity
A repository of comprehensive neural-network-based PyTorch implementations for the semantic text similarity task, including architectures such as Siamese LSTM, Siamese BiLSTM with Attention, Siamese Transformer, and Siamese BERT.
-
LLM-Adapters
Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
adapters reviews and mentions
-
[D] NLP question: does fine-tuning train input embedding?
Usually with computer vision ResNets, people fine-tune only the last layers, whereas in NLP the entire model is typically tuned. There are plenty of cases where people try to avoid this, however, such as with adapters.
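The comment above describes the core idea behind adapters: freeze the pretrained model and train only small bottleneck modules inserted into each layer. A minimal sketch of such a bottleneck module (in the style of Houlsby et al., the architecture the adapters library implements) is shown below; the class and parameter names are illustrative, not the library's actual API:

```python
import numpy as np

class BottleneckAdapter:
    """Illustrative bottleneck adapter: down-project, nonlinearity,
    up-project, residual connection. Only these two small matrices
    would be trained; the surrounding transformer stays frozen."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # Down-projection to a small bottleneck dimension.
        self.W_down = rng.normal(0.0, 0.02, (hidden_dim, bottleneck_dim))
        # Up-projection initialized to zero, so the adapter starts
        # as the identity function and does not disturb the frozen model.
        self.W_up = np.zeros((bottleneck_dim, hidden_dim))

    def __call__(self, h: np.ndarray) -> np.ndarray:
        z = np.maximum(h @ self.W_down, 0.0)  # ReLU in the bottleneck
        return h + z @ self.W_up              # residual connection

adapter = BottleneckAdapter(hidden_dim=768, bottleneck_dim=64)
h = np.ones((2, 768))        # a batch of hidden states
out = adapter(h)
print(out.shape)             # (2, 768)
print(np.allclose(out, h))   # True: identity at initialization
```

With hidden_dim=768 and bottleneck_dim=64, the trainable parameters amount to roughly 2 x 768 x 64 per layer, a small fraction of a full BERT-base model, which is why adapter tuning is much cheaper than full fine-tuning.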
-
[P] AdapterHub v2: Lightweight Transfer Learning with Transformers and Adapters
GitHub: https://github.com/Adapter-Hub/adapter-transformers
-
Our new state-of-the-art multilingual NLP Toolkit - Trankit has been released
Thanks for the question. The main libraries Trankit uses are PyTorch and adapter-transformers. As for the GPU requirement, we have tested our toolkit in different scenarios and found that a single GPU with 4 GB of memory is enough for comfortable use.
-
Stats
adapter-hub/adapters is an open source project licensed under Apache License 2.0 which is an OSI approved license.
The primary programming language of adapters is Jupyter Notebook.