DialoGPT
GODEL
| | DialoGPT | GODEL |
|---|---|---|
| Mentions | 7 | 5 |
| Stars | 2,315 | 835 |
| Growth | 1.0% | 1.1% |
| Activity | 0.0 | 3.4 |
| Latest commit | over 1 year ago | 5 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
DialoGPT
- Just a thought
- Mycroft AI companion
They recommend using https://github.com/microsoft/DialoGPT now, by the way. It appears to be current and maintained, and it is Transformer-based (vs. RNN only), so it might be better long-term to migrate.
- DialoGPT finetuned on my own message data
- I made a Python tool to help you know what to say!
I learned about GPT-3 and its strength as a generative model but couldn't access it yet (I can't afford the API). Thankfully I found DialoGPT, a GPT-2-based pre-trained model trained on Reddit.
- AIstiny: The ultimate debate bot. Phase 0: Viability and call to the community for help
The most readily available technology for creating a chatbot of the type we want (given the type of data we have) is the GPT-2-based DialoGPT. There are many papers and examples, for example this. If you have never heard of GPT-2 before, maybe you have heard of AI Dungeon: although it currently runs on GPT-3, the initial versions were based on GPT-2.
- Telegram Client+Bot that uses Artificial Intelligence to waste scammers' time
No, it uses a chat AI based on GPT-2 called DialoGPT.
- [P] H5Records: Store large datasets in a single file with index access
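Several of the mentions above boil down to "load DialoGPT from the Hugging Face Hub and chat with it". A minimal sketch of that multi-turn loop is below, assuming the `transformers` and `torch` packages are installed; the checkpoint name and `max_length` value follow the commonly used model-card defaults and are not taken from any of the posts above.

```python
# Minimal multi-turn chat loop with DialoGPT, a GPT-2-based causal LM.
# Requires: pip install torch transformers

def build_history(turns, eos_token):
    # DialoGPT is trained on conversation turns joined by the EOS token,
    # so the history is simply every utterance followed by EOS.
    return "".join(turn + eos_token for turn in turns)

def chat(user_inputs, model_name="microsoft/DialoGPT-medium"):
    # Heavy imports live here so the pure helper above stays importable
    # without torch installed.
    import torch  # noqa: F401  (transformers needs it at runtime)
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    turns = []
    for user_text in user_inputs:
        turns.append(user_text)
        input_ids = tokenizer.encode(
            build_history(turns, tokenizer.eos_token), return_tensors="pt"
        )
        output_ids = model.generate(
            input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id
        )
        # Decode only the newly generated tokens, not the whole history.
        reply = tokenizer.decode(
            output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True
        )
        turns.append(reply)
    return turns
```

Calling `chat(["Does money buy happiness?"])` downloads the model weights on first use, so expect a delay and a few hundred megabytes of disk usage the first time.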
GODEL
- Microsoft: Large-scale pretrained models for goal-directed dialog
- Fine-tuning on Sales data?
I would use something like GODEL for this: https://github.com/microsoft/GODEL
- GODEL: Large-Scale Pre-Training for Goal-Directed Dialog
- Microsoft AI Researchers Open-Source 'GODEL': A Large Scale Pre-Trained Language Model For Dialog
Go to the GitHub page; this is able to run on consumer hardware. The largest model they have needs 2.7GB of memory, so running an instance will consume almost all of the RAM on your GPU and you won't be able to use the GPU for anything else, but it will run.
- "GODEL: Large-Scale Pre-Training for Goal-Directed Dialog", Peng et al 2022 {MS}
GitHub offers models up to 2.7B parameters: https://github.com/Microsoft/GODEL
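Unlike DialoGPT, GODEL is a T5-style sequence-to-sequence model that takes an instruction, the dialog context, and optional grounding text in a single prompt. The sketch below follows the prompt layout published on the Hugging Face model card for the GODEL v1.1 checkpoints; the checkpoint name and generation parameters are model-card defaults, not values from the posts above.

```python
# Grounded response generation with GODEL, a T5-style seq2seq model.
# Prompt layout (per the model card):
#   "<instruction> [CONTEXT] <turn1> EOS <turn2> ... [KNOWLEDGE] <text>"
# Requires: pip install torch transformers

def build_query(instruction, dialog, knowledge=""):
    # Dialog turns are joined with ' EOS '; grounding text, if any,
    # is appended after a [KNOWLEDGE] marker.
    if knowledge:
        knowledge = "[KNOWLEDGE] " + knowledge
    context = " EOS ".join(dialog)
    return f"{instruction} [CONTEXT] {context} {knowledge}".strip()

def generate(instruction, dialog, knowledge="",
             model_name="microsoft/GODEL-v1_1-base-seq2seq"):
    # Heavy imports live here so build_query stays usable on its own.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    input_ids = tokenizer(
        build_query(instruction, dialog, knowledge), return_tensors="pt"
    ).input_ids
    output_ids = model.generate(
        input_ids, max_length=128, min_length=8, top_p=0.9, do_sample=True
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

The `[KNOWLEDGE]` slot is what makes GODEL "goal-directed": you can paste in a retrieved document or database record (e.g. sales FAQs, as in the fine-tuning question above) and the model conditions its reply on it.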
What are some alternatives?
pistoBot - Create an AI that chats like you
rasa - 💬 Open source machine learning framework to automate text- and voice-based conversations: NLU, dialogue management, connect to Slack, Facebook, and more - Create chatbots and voice assistants
GPT2-Chinese - Chinese version of GPT2 training code, using BERT tokenizer.
Convoscope - AI tools to augment conversations on smart glasses, wearables, laptops, and smart meeting rooms.
DialogRPT - EMNLP 2020: "Dialogue Response Ranking Training with Large-Scale Human Feedback Data"
forte - Forte is a flexible and powerful ML workflow builder. This is part of the CASL project: http://casl-project.ai/
modular-diffusion - Python library for designing and training your own Diffusion Models with PyTorch.
namekrea - NameKrea is an AI Domain Name Generator which uses GPT-2
TalkToModel - TalkToModel gives anyone the power of XAI through natural language conversations 💬!
conversation-helper - GUI implementation of a Transformer chatbot. Suggests amicable responses to messages from friends.