PIXIU vs happy-transformer
| | PIXIU | happy-transformer |
|---|---|---|
| Mentions | 6 | 1 |
| Stars | 406 | 503 |
| Growth | 8.9% | - |
| Activity | 8.9 | 9.0 |
| Latest commit | 7 days ago | about 2 months ago |
| Language | Jupyter Notebook | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
Recent mentions:
- GPT-Neo-125M-AID (Mia) oversight + retrained
- "This appears to be an actual issue with Happy Transformer, judging by a GitHub issue I found reporting the same problem."
What are some alternatives?
spacy-llm - 🦙 Integrating LLMs into structured NLP pipelines
FARM - 🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
Baichuan-13B - A 13B large language model developed by Baichuan Intelligent Technology
transformers-interpret - Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Baichuan-7B - A large-scale 7B pretraining language model developed by BaiChuan-Inc.
FinBERT-QA - Financial Domain Question Answering with pre-trained BERT Language Model
chatgpt-extractive-shortener - Shortens a paragraph of text with ChatGPT, using successive rounds of word-level extractive summarization.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
small-text - Active Learning for Text Classification in Python
ARElight - Granular Viewer of Sentiments Between Entities in Massively Large Documents and Collections of Texts, powered by AREkit
gector - Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)