aitextgen Alternatives
Similar projects and alternatives to aitextgen
-
transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
-
gpt-neo
Discontinued: An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
-
guidance
Discontinued: A guidance language for controlling large language models. [Moved to: https://github.com/guidance-ai/guidance] (by microsoft)
-
nanoGPT
The simplest, fastest repository for training/finetuning medium-sized GPTs.
-
gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
-
lm-evaluation-harness
A framework for few-shot evaluation of language models.
-
serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.
-
hivemind
Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.
-
memos
An open source, lightweight note-taking service. Easily capture and share your great thoughts.
-
simpleaichat
Python package for easily interfacing with chat apps, with robust features and minimal code complexity.
-
DiscordChatAI-GPT2
A chat AI discord bot written in python3 using GPT-2, trained on data scraped from every message of my discord server (can be trained on yours too)
-
cramming
Cramming the training of a (BERT-type) language model into limited compute.
aitextgen reviews and mentions
-
Where is the engineering part in "prompt engineer"?
It's literally a wrapper for the ChatGPT API (currently). I have another library for training models from scratch but haven't had time to work on it.
-
self-hosted AI?
I'm experimenting with https://github.com/minimaxir/aitextgen for some simple tasks. It is pretty much a wrapper around gpt2 and gpt neox models.
-
NanoGPT
To train small gpt-like models, there's also aitextgen: https://github.com/minimaxir/aitextgen
-
Downloaded GPT-2, Encode.py, and Train.py not found.
If by "downloaded" you mean cloned the gpt-2 GitHub repo, it doesn't come with those scripts. I personally played around with https://github.com/minimaxir/aitextgen, which is a simple wrapper around the gpt-2 code and comes with some very clear usage examples. (Shout out to minimaxir and everyone else involved in aitextgen for making gpt-2 easy to use!)
-
OpenAI’s API Now Available with No Waitlist
AI text content generation is indeed a legit industry that's still in its nascent stages. It's why I myself have spent a lot of time working with it, and working on tools for fully custom text generation models (https://github.com/minimaxir/aitextgen).
However, there are tradeoffs currently. In the case of GPT-3, they are cost and the risk of brushing up against the Content Guidelines.
There's also the surprisingly underdiscussed risk of copyright in generated content. OpenAI won't enforce their own copyright, but it's possible for GPT-3 to output existing content verbatim, which is a massive legal liability. (It's half the reason I'm researching custom models fully trained on copyright-safe content.)
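The verbatim-output risk described above can be screened for mechanically. A minimal sketch (pure Python; the corpus, samples, and the 8-word threshold are all illustrative assumptions, not part of any library) that flags generated text sharing long word n-grams with a training corpus:

```python
def ngrams(text, n):
    """Yield word-level n-grams from a text."""
    words = text.split()
    for i in range(len(words) - n + 1):
        yield tuple(words[i:i + n])

def verbatim_overlap(generated, corpus, n=8):
    """Return the n-grams a generated text shares with the training corpus.

    Long shared runs (around 8+ words) are a strong signal the model
    reproduced training data verbatim rather than composing new text.
    """
    corpus_grams = set(ngrams(corpus, n))
    return [g for g in ngrams(generated, n) if g in corpus_grams]

# Hypothetical example: an exact copy of a long run is flagged, a paraphrase is not.
corpus = "the quick brown fox jumps over the lazy dog every morning"
copied = "he wrote that the quick brown fox jumps over the lazy dog"
paraphrase = "a fast auburn fox leapt over a sleepy hound at dawn today"
assert verbatim_overlap(copied, corpus, n=8)
assert verbatim_overlap(paraphrase, corpus, n=8) == []
```

Paraphrases rarely share word runs that long, while verbatim copies do, which makes this a cheap first-pass filter before a human review.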
-
I made Gravital, a Discord AI chatbot that trains from your own server's message history
Eventually, I discovered aitextgen, a newer and much better library for using GPT-2 by the same creator as gpt-2-simple. After deciding to merge the best of both worlds and sprinkle in a bit of my own spice, Gravital was born!
-
Pytorch incorrect paging file size
It seems as though there is a lot going on, and what I found through research couldn't be directly applied; for example, I heard you should set num_workers to zero, but since the training isn't run by PyTorch directly but by the package aitextgen, I don't have direct access to those properties.
-
AI Can Generate Convincing Text–and Anyone Can Use It
As someone who works on a Python library solely devoted to making AI text generation more accessible to the normal person (https://github.com/minimaxir/aitextgen), I think the headline is misleading.
Although the article focuses on the release of GPT-Neo, even GPT-2, released in 2019, was good at generating text; it just spat out a lot of garbage requiring curation, which GPT-3/GPT-Neo still require, albeit with a better signal-to-noise ratio.
GPT-Neo, meanwhile, is such a big model that it requires a bit of data-engineering work to get it operating and generating text (see the README: https://github.com/EleutherAI/gpt-neo), and it's currently unclear whether it's as good as GPT-3, even when comparing models apples-to-apples.
That said, Hugging Face is adding support for GPT-Neo to Transformers (https://github.com/huggingface/transformers/pull/10848), which will make playing with the model easier, and I'll add support to aitextgen if it pans out.
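The curation step mentioned above (discarding garbage samples before a human ever reads them) can be partially automated. A rough sketch in pure Python; the thresholds and sample texts are illustrative assumptions, not anything aitextgen or Transformers ships:

```python
def looks_usable(sample, min_words=20, min_unique_ratio=0.5):
    """Crude curation filter for raw model output.

    Rejects samples that are too short or dominated by repeated words,
    two common failure modes of small GPT-2 models. Thresholds are
    illustrative, not tuned.
    """
    words = sample.split()
    if len(words) < min_words:
        return False
    # A low unique-word ratio usually means the model got stuck in a loop.
    if len(set(words)) / len(words) < min_unique_ratio:
        return False
    return True

samples = [
    "the the the the the the the the the the the the the the the the the the the the",
    "ok",
    "A longer passage with varied vocabulary that a human curator would plausibly keep, "
    "since it has enough length and does not loop on a single token endlessly here.",
]
kept = [s for s in samples if looks_usable(s)]
```

Filters like this don't replace human curation, but they cut the volume of obvious garbage, which is exactly the signal-to-noise problem described above.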
-
Show HN: GPT-2 Twitter Bot
-
Replicating GPT-2 at Home
As someone who maintains a package that makes it easy to both fine-tune GPT-2 and create your own model from scratch (https://github.com/minimaxir/aitextgen), this submission is a good run-through of the technical considerations in building a GPT-2 model.
It's both substantially easier and faster than it was when OpenAI released their paper in 2019, thanks to Hugging Face's Transformers and Tokenizers making the architectures more efficient and to other companies streamlining the training process.
You don't need a TPU cluster to train a working GPT-2 model, although it helps (unfortunately, TPU support for PyTorch-based training like aitextgen's is more fussy). A free GPU on Colab gets you most of the way, especially since you can now get a T4 or a V100, which lets you use FP16.
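To see why FP16 matters on a T4 or V100: halving the bytes per weight halves the parameter memory (and unlocks tensor-core math). A back-of-the-envelope estimate for GPT-2 small's roughly 124M parameters, ignoring activations, gradients, and optimizer state, which add considerably more in practice:

```python
# Rough parameter-memory estimate for GPT-2 "small" (~124M parameters).
# Real training uses far more memory (activations, gradients, optimizer
# state), but the 2x saving on every tensor is what makes FP16 attractive.
params = 124_000_000

fp32_weights_gib = params * 4 / 1024**3   # 4 bytes per FP32 weight
fp16_weights_gib = params * 2 / 1024**3   # 2 bytes per FP16 weight

print(f"FP32 weights: {fp32_weights_gib:.2f} GiB")
print(f"FP16 weights: {fp16_weights_gib:.2f} GiB")
```

The weights alone fit easily either way; the 2x factor matters once gradients and optimizer state multiply that footprint on a 16 GiB card.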
-
Stats
minimaxir/aitextgen is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of aitextgen is Python.