keytotext
adaptnlp
| | keytotext | adaptnlp |
|---|---|---|
| Mentions | 5 | 2 |
| Stars | 435 | 414 |
| Growth | - | 0.0% |
| Activity | 3.1 | 0.0 |
| Last commit | 8 months ago | over 2 years ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
keytotext
- [Machine Learning] [P] Implementation of text generation from keywords Python Module
I am trying to find a module with a key to text generator using NLP models, I have been using "keytotext" (https://github.com/gagan3012/keytotext) which has been working really well up until now but today it seems like the models have been taken down from https://huggingface.co/models.
- Library that takes a pool of words and spits out sentences with only those words?
This library can generate sentences based on the given keywords using T5. I feel like this is probably close to what you are looking for.
- Keytotext: Convert Keywords to Large Texts
- Keytotext
Hello, presenting Keytotext: an NLP model that converts keywords into sentences and longer texts. It is built on the T5 model. Keytotext is installable from PyPI and also offers an on-demand inference API. It additionally features a UI built with Streamlit and a GPU-enabled Colab notebook for easy usage. Please check it out on GitHub: https://github.com/gagan3012/keytotext, and star it if you liked the work!
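Since Keytotext is a T5 wrapper, its core preprocessing step is turning a list of keywords into a single seq2seq input string. The sketch below illustrates that idea; the separator token, the checkpoint name `gagan3012/k2t-base`, and the generation parameters are assumptions for illustration, not the library's confirmed internals.

```python
# Sketch of how a keyword-to-text pipeline like keytotext might prepare
# its model input. The separator and checkpoint name are assumptions
# for illustration; the library's actual preprocessing may differ.

def keywords_to_prompt(keywords, sep="|"):
    """Join a list of keywords into one input string for a seq2seq
    model such as T5."""
    if not keywords:
        raise ValueError("need at least one keyword")
    return f" {sep} ".join(k.strip() for k in keywords)

prompt = keywords_to_prompt(["india", "wedding", "traditions"])
print(prompt)  # india | wedding | traditions

# With Hugging Face transformers, the prompt would then be fed to a
# fine-tuned T5 checkpoint roughly like this (commented out because it
# requires a model download, and the checkpoint name is an assumption):
#
# from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
# tokenizer = AutoTokenizer.from_pretrained("gagan3012/k2t-base")
# model = AutoModelForSeq2SeqLM.from_pretrained("gagan3012/k2t-base")
# ids = tokenizer(prompt, return_tensors="pt").input_ids
# out = model.generate(ids, max_length=64, num_beams=4)
# print(tokenizer.decode(out[0], skip_special_tokens=True))
```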
adaptnlp
- Tools to use for a Semantic-Search Question Answering System
Check out adaptnlp.
- Case Sensitivity using HuggingFace & Google's T5 model (base)
Yes, there are capitals in the tokenizer vocabularies of t5-base and t5-small, so both support capitalization. A few days ago I was using t5-small through adaptnlp for extractive summarization, and capitalization was working fine (https://github.com/Novetta/adaptnlp). AdaptNLP is essentially a wrapper around transformers, so if you can't figure out a solution, you could dissect their source code.
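A quick way to convince yourself that a tokenizer preserves capitalization is an encode/decode round trip: if decoding the encoded text returns the original casing, the vocabulary is case sensitive. The toy whitespace tokenizer below is a stand-in so the sketch runs without downloading a model; with Hugging Face transformers you would pass `T5Tokenizer.from_pretrained("t5-small")` (and its `encode`/`decode` methods) instead.

```python
# Sketch: check whether a tokenizer preserves capitalization by
# encoding text and decoding it back. ToyTokenizer is a stand-in
# used so the example runs offline; a real check would use the
# t5-small tokenizer from the transformers library.

class ToyTokenizer:
    """Trivial whitespace tokenizer that keeps case, like t5-small."""
    def encode(self, text):
        return text.split()

    def decode(self, tokens):
        return " ".join(tokens)

def preserves_case(tokenizer, text):
    """True if an encode/decode round trip keeps the original casing."""
    return tokenizer.decode(tokenizer.encode(text)) == text

tok = ToyTokenizer()
print(preserves_case(tok, "Summarize This Article"))  # True
```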
What are some alternatives?
ML-Workspace - 🛠 All-in-one web-based IDE specialized for machine learning and data science.
Basic-UI-for-GPT-J-6B-with-low-vram - A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Model loading requires 12 GB of free RAM.
mt5-M2M-comparison - Comparing M2M and mT5 on rare language pairs, blog post: https://medium.com/@abdessalemboukil/comparing-facebooks-m2m-to-mt5-in-low-resources-translation-english-yoruba-ef56624d2b75
fastai - The fastai deep learning library
aws-lambda-docker-serverless-inference - Serve scikit-learn, XGBoost, TensorFlow, and PyTorch models with AWS Lambda container images support.
browser-ml-inference - Edge Inference in Browser with Transformer NLP model
Machine-Learning-Cyrillic-Classifier - A web app where you draw a letter of the Russian alphabet and the ML algorithm predicts which letter you drew.
gector - Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)
predict-subreddit - NLP model that predicts subreddit based on the title of a post
Transformers-Tutorials - This repository contains demos I made with the Transformers library by HuggingFace.
mfp-wrapped - Data app to provide analytics for myfitnesspal users: a calorie counter and food journal
Deep-Learning-Experiments - Videos, notes and experiments to understand deep learning