gector vs adaptnlp
| | gector | adaptnlp |
|---|---|---|
| Mentions | 2 | 2 |
| Stars | 863 | 414 |
| Growth | 1.4% | 0.0% |
| Activity | 0.0 | 0.0 |
| Latest commit | 9 months ago | over 2 years ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of GitHub stars a project has. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gector
- ML application for grammar correction

  "If you capture it, you might as well correct it. Check out Gramformer or Grammarly's GECToR. You can score text by the number of mistakes these models propose, i.e. the fewer, the better."

- Is there any way to detect grammatical errors and classify text as being either grammatically correct or incorrect?
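The comment above suggests scoring text by how many mistakes a correction model proposes. A minimal sketch of that idea, assuming you already have the model's corrected output; the `mistake_count` helper and the example sentences are illustrative, not part of GECToR or Gramformer:

```python
import difflib

def mistake_count(original: str, corrected: str) -> int:
    """Count word-level edit regions between a sentence and its
    model-proposed correction; fewer edits suggests fewer mistakes."""
    sm = difflib.SequenceMatcher(a=original.split(), b=corrected.split())
    # Each contiguous non-equal opcode region counts as one proposed change.
    return sum(1 for op, *_ in sm.get_opcodes() if op != "equal")

# A grammar-correction model would supply `corrected` in practice.
original = "He go to school every days"
corrected = "He goes to school every day"
print(mistake_count(original, corrected))  # → 2
```

Word-level (rather than character-level) diffing keeps the score aligned with how many distinct corrections a reader would perceive.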
adaptnlp
- Tools to use for a semantic-search question-answering system

  "Check out adaptnlp."

- Case sensitivity using HuggingFace & Google's T5 model (base)

  "Yes, there are capitals in the tokenizer vocabulary of t5-base and t5-small, so both support capitalization. A few days ago I was using t5-small through adaptnlp for extractive summarization, and capitalization worked fine (https://github.com/Novetta/adaptnlp). AdaptNLP is essentially a wrapper around transformers, so if you can't find a solution, you can dissect its source code."
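The comment above says case support can be verified by looking for capitals in the tokenizer vocabulary. A minimal sketch of that check, using a toy vocabulary as a stand-in for the real t5-base/t5-small SentencePiece vocab (in practice you would load it via transformers' `AutoTokenizer`; the toy vocab and `has_capitalized_tokens` are illustrative assumptions):

```python
# Toy stand-in for a SentencePiece vocabulary; "▁" marks a word boundary.
toy_vocab = ["▁the", "▁The", "▁hello", "▁Hello", "▁and", "s", "ing"]

def has_capitalized_tokens(vocab) -> bool:
    """True if any token contains an uppercase letter, i.e. the
    tokenizer can represent capitalization directly."""
    return any(any(ch.isupper() for ch in tok) for tok in vocab)

print(has_capitalized_tokens(toy_vocab))  # → True
```

With the real tokenizer, the same check would run over `tokenizer.get_vocab().keys()`.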
What are some alternatives?
- Gramformer - A framework for detecting, highlighting and correcting grammatical errors in natural language text. Created by Prithiviraj Damodaran. Open to pull requests and other forms of collaboration.
- Basic-UI-for-GPT-J-6B-with-low-vram - A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Model loading requires 12 GB of free RAM.
- transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- keytotext - Keywords to Sentences
- spark-nlp - State-of-the-art Natural Language Processing
- fastai - The fastai deep learning library
- DeBERTa - The implementation of DeBERTa
- browser-ml-inference - Edge inference in the browser with a Transformer NLP model
- happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
- Transformers-Tutorials - This repository contains demos made with the Transformers library by HuggingFace.
- bertviz - BertViz: Visualize attention in NLP models (BERT, GPT-2, BART, etc.)
- ML-Workspace - 🛠 All-in-one web-based IDE specialized for machine learning and data science.