simpletransformers
kiri
| | simpletransformers | kiri |
|---|---|---|
| Mentions | 6 | 12 |
| Stars | 3,984 | 240 |
| Growth | - | 0.0% |
| Activity | 7.3 | 3.2 |
| Latest commit | about 1 month ago | almost 3 years ago |
| Language | Python | Python |
| License | Apache License 2.0 | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
simpletransformers
-
Hugging Face is a great idea poorly executed.
You might try this: https://github.com/ThilinaRajapakse/simpletransformers
-
GPT-2 124M using transformers
https://github.com/ThilinaRajapakse/simpletransformers/blob/master/simpletransformers/language_generation/language_generation_model.py#L146
-
Neural Search Tutorial
Getting embeddings from BERT Encoder
-
Neural Search Step-by-Step
Tutorial includes:
- What is neural search?
- Getting embeddings from a BERT encoder
- Using the Qdrant vector search engine
- Creating an API server with FastAPI
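The core of that pipeline (embed documents, then retrieve by vector similarity) can be sketched without any model at all. Below, mean-pooled token vectors stand in for BERT embeddings and a brute-force cosine search stands in for Qdrant; all names, vectors, and the two-dimensional toy corpus are illustrative only.

```python
import numpy as np

def mean_pool(token_vectors):
    """Collapse per-token vectors into one document embedding,
    a common way to pool BERT's last hidden states."""
    return np.asarray(token_vectors).mean(axis=0)

def cosine_search(query_vec, doc_vecs, top_k=1):
    """Brute-force nearest neighbours by cosine similarity --
    the job a vector engine like Qdrant does at scale."""
    docs = np.asarray(doc_vecs)
    sims = docs @ query_vec / (
        np.linalg.norm(docs, axis=1) * np.linalg.norm(query_vec)
    )
    return np.argsort(-sims)[:top_k]

# Toy corpus of pre-pooled "embeddings"
corpus = [
    mean_pool([[1.0, 0.0], [0.9, 0.1]]),  # doc 0
    mean_pool([[0.0, 1.0], [0.1, 0.9]]),  # doc 1
]
query = np.array([1.0, 0.05])
print(cosine_search(query, corpus))  # doc 0 is the closest match
```

In the real tutorial, `mean_pool` is replaced by a BERT forward pass and `cosine_search` by a Qdrant collection query; the shape of the data flow is the same.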
-
Document Classification
If you want to do text classification, Hugging Face Transformers is great. There's also a simpler wrapper for it: https://github.com/ThilinaRajapakse/simpletransformers
-
A Shortly-like user interface for GPT-2?
Here is an example script to finetune a GPT-2 model: https://github.com/ThilinaRajapakse/simpletransformers/blob/master/examples/language_generation/fine_tune.py
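For orientation, the linked fine_tune.py boils down to something like the sketch below. This is a hedged outline, not the exact script: `train.txt` is a placeholder, the args shown are a minimal subset, and running `finetune` requires the simpletransformers library (and a GPT-2 download).

```python
# Hedged sketch of GPT-2 finetuning with simpletransformers, loosely
# following examples/language_generation/fine_tune.py in the repo.

train_args = {
    "mlm": False,                 # GPT-2 is a causal LM, not a masked LM
    "num_train_epochs": 1,
    "output_dir": "outputs/",
    "overwrite_output_dir": True,
}

def finetune(train_file="train.txt"):
    # Import kept local so the config above stays inspectable
    # even without the library installed.
    from simpletransformers.language_modeling import LanguageModelingModel
    model = LanguageModelingModel("gpt2", "gpt2", args=train_args)
    model.train_model(train_file)
    return model
```

Calling `finetune("my_corpus.txt")` kicks off training; generation afterwards goes through the `LanguageGenerationModel` class linked in the earlier mention.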
kiri
-
[P][D] NLP question - Question Answering AI
I'm one of the authors of Backprop, a library built for transfer learning.
-
Backprop: Use and finetune models in a single line of code
I'd like to share Backprop, an open source library I've been co-authoring for the last few months.
-
[P] Backprop Model Hub: a curated list of state-of-the-art models
We've also got an open-source library that makes using + finetuning these models possible in a few lines of code.
-
Show HN: Backprop – a simple library to use and finetune state-of-the-art models
-
Show HN: Backprop – a library to easily finetune and use state-of-the-art models
-
[P] Backprop: a library to easily finetune and use state-of-the-art models
I'd like to share Backprop, a Python library I've been co-authoring for the last few months. Our goal is to make finetuning and using models as easy as possible, even without extensive ML experience.
-
GPT Neo: open-source GPT-3-like model with pretrained weights available
You might get some really promising results with finetuning.
If anything, you could build a writing assistant that almost automates responses.
I've been co-authoring a library that lets you finetune such models in a single line of code.
https://github.com/backprop-ai/backprop
Specifically, the text generation finetuning example should be what you are looking for: https://github.com/backprop-ai/backprop/blob/main/examples/F...
Hope this helps, happy to chat more about it. Pretty curious about the results.
-
NLP Model for extracting specific text from raw text
Here's an example Jupyter Notebook for finetuning T5. Full disclosure, I work on this library myself -- but it could be helpful.
-
[D] Need help with document classifier and later prediction of text
I'm working on a library that hopefully makes working with some of these a bit easier -- here's an example notebook for running text classification with the BART checkpoint, if you're interested. If you need more task-specific finetuning for text classification, that's going to be rolled out in the near future.
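Zero-shot classification with an NLI checkpoint like facebook/bart-large-mnli works by turning each candidate label into an entailment hypothesis and scoring the premise/hypothesis pairs with the NLI model. The pair construction below is the pure part of that idea; the actual scoring call (shown in a comment) needs the transformers library and a large model download. The example text and labels are made up.

```python
def build_nli_pairs(text, labels, template="This example is {}."):
    """Premise/hypothesis pairs fed to the NLI model, one per candidate label."""
    return [(text, template.format(label)) for label in labels]

pairs = build_nli_pairs(
    "The invoice is due at the end of the month.",
    ["finance", "sports", "politics"],
)
print(pairs)

# With transformers installed, the same idea in one call:
#   from transformers import pipeline
#   clf = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
#   clf("The invoice is due at the end of the month.",
#       candidate_labels=["finance", "sports", "politics"])
```

The label whose hypothesis gets the highest entailment probability wins, which is why this works with no task-specific finetuning at all.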
-
Generating notes from text
I'm working on a library that includes a few different ML tasks, including summarisation. It uses a pretrained version of Google's T5 transformer model, which we host on Hugging Face with some details on how it was trained.
What are some alternatives?
BERTweet - BERTweet: A pre-trained language model for English Tweets (EMNLP-2020)
gpt-neox - An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
minGPT - A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
qagnn - [NAACL 2021] QAGNN: Question Answering using Language Models and Knowledge Graphs 🤖
layout-parser - A Unified Toolkit for Deep Learning Based Document Image Analysis
CLIP - CLIP (Contrastive Language-Image Pretraining), Predict the most relevant text snippet given an image
fastapi - FastAPI framework, high performance, easy to learn, fast to code, ready for production
Questgen.ai - Question generation using state-of-the-art Natural Language Processing algorithms
rasa - 💬 Open source machine learning framework to automate text- and voice-based conversations: NLU, dialogue management, connect to Slack, Facebook, and more - Create chatbots and voice assistants
haystack - :mag: LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.