AliceMind vs RATransformers
| | AliceMind | RATransformers |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 1,946 | 41 |
| Growth | 1.3% | - |
| Activity | 5.7 | 0.0 |
| Latest commit | about 2 months ago | over 1 year ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Posts mentioning AliceMind:
- [P] mPLUG-Owl: Modularization Empowers Large Language Models with Multimodality (relevant code found at https://github.com/alibaba/AliceMind)

Posts mentioning RATransformers: none listed.
What are some alternatives?
- mPLUG-Owl - mPLUG-Owl & mPLUG-Owl2: Modularized Multimodal Large Language Model
- extreme-bert - ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper "ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized BERT".
- jina-financial-qa-search
- pixel - Research code for pixel-based encoders of language (PIXEL)
- transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- transformer-mlm - Implementation of Transformer Encoders / Masked Language Modeling Objective
- frame-semantic-transformer - Frame Semantic Parser based on T5 and FrameNet
- happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
- lightning-mlflow-hf - Use QLoRA to tune LLMs in PyTorch Lightning with Hugging Face + MLflow