GPTeacher Alternatives
Similar projects and alternatives to GPTeacher
-
text-generation-webui
A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
-
character-editor
Create, edit and convert AI character files for CharacterAI, Pygmalion, Text Generation, KoboldAI and TavernAI
-
ue5-llama-lora
A proof-of-concept project that showcases the potential for using small, locally trainable LLMs to create next-generation documentation tools.
-
llm-jeopardy
Automated prompting and scoring framework to evaluate LLMs using updated human knowledge prompts
GPTeacher reviews and mentions
-
Pygmalion Dataset Availability
If you're looking for something similar to the RP aspect of Pygmalion, then the GPTeacher Roleplay dataset is the closest available: https://github.com/teknium1/GPTeacher
GPTeacher is a collection of modular datasets generated by GPT-4: General-Instruct, Roleplay-Instruct, Code-Instruct, and Toolformer.
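The datasets linked above are JSON collections of GPT-4-generated instruction/response pairs. A minimal sketch of turning such a file into Alpaca-style training prompts, assuming the common `instruction`/`input`/`response` field names (an assumption for illustration, not verified against the repo's exact schema):

```python
import json

# Hypothetical sample mimicking the instruction/input/response schema
# commonly used by GPT-4-generated instruct datasets. In practice you
# would json.load() one of the files from the GPTeacher repo instead.
sample = json.loads("""
[
  {"instruction": "Translate 'hello' to French.",
   "input": "",
   "response": "Bonjour."}
]
""")

def to_prompt(example):
    """Format one example as a single Alpaca-style prompt string."""
    parts = [f"### Instruction:\n{example['instruction']}"]
    if example.get("input"):  # the input field is often empty
        parts.append(f"### Input:\n{example['input']}")
    parts.append(f"### Response:\n{example['response']}")
    return "\n\n".join(parts)

prompts = [to_prompt(ex) for ex in sample]
print(prompts[0])
```

This is the same prompt template popularized by Stanford's Alpaca; fine-tuning scripts for the models discussed below typically consume the data in this flattened form.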
-
New Llama 13B model from Nomic.AI: GPT4All-13B-Snoozy. Available on Hugging Face in HF, GPTQ, and GGML formats
The 13B version uses the general-instruct GPTeacher dataset from teknium. In the models wiki, I distinguish between the two by referring to them as GPT4 Alpaca for 30B and the original name GPT4 x Alpaca for 13B.
-
What’s the current best model that will run well locally on a 3090?
No, GPT4 x Alpaca, GPT4 Alpaca, and GPT4All use different datasets. GPT4 x Alpaca uses GPTeacher, GPT4 Alpaca uses Microsoft Research's GPT-4-LLM, and GPT4All uses its own. GPT4All is commonly considered the weakest of the three in the general community.
-
Best datasets for local training?
GPT4-alpaca dataset: https://github.com/teknium1/GPTeacher
-
GPT4-X-Alpaca 30B 4-bit, by MetaIX based on LoRA by chansung
For anyone wondering how this compares with the 13B GPT4 x Alpaca, the dataset used is different. The 13B GPT4xAlpaca uses the GPTeacher dataset, while this uses the Microsoft Research dataset from Instruction Tuning with GPT-4. It should be a direct upgrade to Stanford's Alpaca, and I'll add it to the wiki as GPT4 Alpaca without an x to differentiate it.
-
[P] The weights necessary to construct Vicuna, a fine-tuned LLM with capabilities comparable to GPT-3.5, have now been released
The dataset is here: https://huggingface.co/chavinlo/gpt4-x-alpaca/discussions/1#642920c0b20cdada12fa7d20
Stats
teknium1/GPTeacher is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of GPTeacher is Python.