awesome-pretrained-chinese-nlp-models vs educational-transformer
| | awesome-pretrained-chinese-nlp-models | educational-transformer |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 4,250 | 2 |
| Growth | - | - |
| Activity | 8.9 | 3.8 |
| Latest commit | 4 days ago | 8 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub.
Growth - month over month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
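The exact formula behind the activity number is not published here, but the idea of weighting recent commits more than older ones can be sketched with a simple exponential decay. Everything below (the `half_life_days` parameter, the function name, the decay shape) is an illustrative assumption, not the tracker's actual implementation:

```python
from datetime import datetime, timedelta

def activity_score(commit_dates, now, half_life_days=30.0):
    """Recency-weighted commit count (illustrative sketch only).

    Each commit contributes 0.5 ** (age_in_days / half_life_days),
    so a commit made today counts as 1.0, a commit one half-life
    old counts as 0.5, and very old commits contribute almost nothing.
    """
    score = 0.0
    for d in commit_dates:
        age_days = max((now - d).total_seconds() / 86400.0, 0.0)
        score += 0.5 ** (age_days / half_life_days)
    return score

now = datetime(2024, 1, 1)
recent = [now - timedelta(days=n) for n in (1, 2, 3)]
old = [now - timedelta(days=n) for n in (300, 310, 320)]

# Three recent commits outweigh three commits from ~10 months ago.
print(activity_score(recent, now) > activity_score(old, now))
```

Under a scheme like this, the repo committed to 4 days ago would naturally score far above the one last touched 8 months ago, which matches the 8.9 vs 3.8 spread in the table.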
What are some alternatives?
ai_and_memory_wall - AI and Memory Wall
DialoGPT - Large-scale pretraining for dialogue
rust-bert - Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
PaddleNLP - 👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting wide-range of NLP tasks from research to industrial applications, including 🗂Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis etc.
RWKV-LM - RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it's combining the best of RNN and transformer - great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding.
ERNIE-text-classification-pytorch - This repo contains a PyTorch implementation of a pretrained ERNIE model for text classification.
LoRA - Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
TencentPretrain - Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo