| | every-chatgpt-gui | awesome-instruction-dataset |
|---|---|---|
| Mentions | 6 | 6 |
| Stars | 1,818 | 1,005 |
| Growth | - | - |
| Activity | 7.5 | 6.2 |
| Last commit | 8 days ago | 4 months ago |
| License | MIT License | - |
- Stars: the number of stars a project has on GitHub.
- Growth: month-over-month growth in stars.
- Activity: a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we are tracking.
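The exact formula behind the activity score is not published; a minimal sketch of the underlying idea (recency-weighted commit counting, with a hypothetical 30-day half-life) might look like:

```python
from datetime import datetime, timezone

def activity_score(commit_dates, half_life_days=30.0):
    """Recency-weighted commit count: each commit contributes
    0.5 ** (age_in_days / half_life_days), so recent commits
    count for more. The half-life is a hypothetical parameter,
    not the site's actual formula."""
    # Fixed reference time so the example is reproducible.
    now = datetime(2024, 1, 1, tzinfo=timezone.utc)
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400
        score += 0.5 ** (age_days / half_life_days)
    return score

# A commit "today" contributes 1.0; one exactly 30 days old contributes 0.5.
recent = activity_score([datetime(2024, 1, 1, tzinfo=timezone.utc)])
older = activity_score([datetime(2023, 12, 2, tzinfo=timezone.utc)])
```

A relative score like the 7.5 vs. 6.2 above would then come from ranking such raw scores across all tracked projects.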
every-chatgpt-gui

- "GPT 4 new limits only 40 messages in 3 days": There are several UIs you can use: list on GitHub
- "ChatGPT needs its own desktop application": Check out this list of ChatGPT desktop applications: https://github.com/billmei/every-chatgpt-gui There was a discussion about this here: https://www.reddit.com/r/ChatGPT/comments/18eer32/i_am_so_close_to_cancelling_my_pro_subscription/
- "GPT Message limit is lying?"
- "I am so close to cancelling my pro subscription.": You can use any one from here: https://github.com/billmei/every-chatgpt-gui
- "How can I access ChatGPT from work computer."
- "Show HN: Every front-end UI for ChatGPT"
awesome-instruction-dataset

- "[D] What are the best Open Source Instruction-Tuned LLMs? Is there any benchmark on instruction datasets?": I found some instruction-tuning datasets (link) but I can't seem to find a benchmark of the best LLMs on those kinds of datasets.
- "Datasets to train your own GPT-4 (image input GPT) and training codes for people interested in training LLMs": A repo (https://github.com/yaodongC/awesome-instruction-dataset) designed to provide a one-stop solution for all your LLM dataset needs! Perfect for experimenting with your own ChatGPT/LLM (MiniGPT4, Alpaca, LLaMA) model.
- "Comprehensive Datasets for Training GPT-4-like (LLM) Models, and training repo collection. Including datasets for multi-modal LLM models."
- "Collection of datasets to train your own multi-modal GPT-4/LLMs"
- "Datasets to train your own GPT-4 (image input GPT) for people interested in training LLMs": In case anyone wants to train their own GPT-4-style models, here is a list of datasets for training LLMs (https://github.com/yaodongC/awesome-instruction-dataset).
- "🚀 Comprehensive List of Datasets for Training LLaMA Models (GPT-4 & Beyond) 🧠": I've put together an extensive collection of instruction-based datasets perfect for experimenting with your own LLaMA model and beyond (https://github.com/yaodongC/awesome-instruction-dataset). If you've been searching for resources to advance your own projects or simply want to learn more about these cutting-edge models, this repository is just what you need!
What are some alternatives?
vanilla-chatgpt - a minimal ChatGPT client in vanilla JavaScript; runs locally or from any static web host
gpt4free - the official gpt4free repository; a collection of powerful language models
poe_sidebar_robots_remover - Remove Poe.com useless Robots from Sidebar
chatgpt-vscode - Your best AI pair programmer in VS Code
localGPT - Chat with your documents on your local device using GPT models. No data leaves your device and 100% private.
siri-gpt - Voice controlled ChatGPT for iOS using Shortcuts with temporary memory to carry extended conversations
awesome-gpt - A curated list of awesome ChatGPT-related applications, software, tools, resources.
Instructgpt-prompts - A collection of ChatGPT and GPT-3.5 instruction-based prompts for generating and classifying text.
openai-gpt4 - decentralising the AI industry; free gpt-4/3.5 scripts through several reverse-engineered APIs (poe.com, phind.com, chat.openai.com, writesonic.com, sqlchat.ai, t3nsor.com, you.com, etc.) [Moved to: https://github.com/xtekky/gpt4free]
FlexGen - Running large language models like OPT-175B/GPT-3 on a single GPU. Focusing on high-throughput generation. [Moved to: https://github.com/FMInference/FlexGen]
LongForm - Reverse Instructions to generate instruction tuning data with corpus examples