AliceMind vs extreme-bert
| | AliceMind | extreme-bert |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 1,946 | 283 |
| Star growth | 1.3% | 0.0% |
| Activity | 5.7 | 0.0 |
| Latest commit | about 2 months ago | about 1 year ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
AliceMind
- [P] mPLUG-Owl: Modularization Empowers Large Language Models with Multimodality (relevant code at https://github.com/alibaba/AliceMind)
extreme-bert
- [P] Releasing customized language model pre-training acceleration toolkit: ExtremeBERT (relevant code at https://github.com/extreme-bert/extreme-bert)
What are some alternatives?
- RATransformers - RATransformers 🐭 - Make your transformer (like BERT, RoBERTa, GPT-2, and T5) relation-aware!
- torchscale - Foundation Architecture for (M)LLMs
- mPLUG-Owl - mPLUG-Owl & mPLUG-Owl2: Modularized Multimodal Large Language Model
- primeqa - The prime repository for state-of-the-art multilingual question answering research and development.
- jina-financial-qa-search
- pixel - Research code for pixel-based encoders of language (PIXEL)
- transformers - 🤗 Transformers: State-of-the-art machine learning for PyTorch, TensorFlow, and JAX.
- bertviz - BertViz: Visualize attention in NLP models (BERT, GPT-2, BART, etc.)
- happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP transformer models.
- transformer-mlm - Implementation of transformer encoders / masked language modeling objective