| | gpt-neo | aitextgen |
|---|---|---|
| Mentions | 82 | 19 |
| Stars | 6,158 | 1,826 |
| Growth | - | - |
| Activity | 7.3 | 1.8 |
| Last commit | about 2 years ago | 10 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
gpt-neo
-
How Open is Generative AI? Part 2
By December 2020, EleutherAI had introduced The Pile, a comprehensive text dataset designed for training language models; tech giants such as Microsoft, Meta, and Google subsequently used it to train their own models. In March 2021, EleutherAI revealed GPT-Neo, an open-source model under the Apache 2.0 license that was the largest of its kind at launch. EleutherAI's later projects include the release of GPT-J, a 6-billion-parameter model, and GPT-NeoX, a 20-billion-parameter model unveiled in February 2022. Their work demonstrates the viability of high-quality open-source AI models.
-
Creating an open source chat bot like ChatGPT for my own dataset without GPU?
Yeah, if that is your requirement you should definitely ignore ChatterBot, as it's older and probably not what your teacher wants. I'm looking at the gpt-neo docs right now: https://github.com/EleutherAI/gpt-neo
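For what it's worth, the released GPT-Neo checkpoints can also be loaded through Hugging Face transformers, which is the most practical route without a GPU. A minimal sketch, assuming the smallest 125M checkpoint (the prompt and sampling settings are illustrative):

```python
# Run the smallest GPT-Neo checkpoint on CPU via Hugging Face transformers.
# The gpt-neo repo itself targets TPU/GPU training, but the released
# weights work through transformers for inference.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125m")

result = generator(
    "Q: How do I train a chatbot on my own dataset?\nA:",
    max_new_tokens=60,
    do_sample=True,
    temperature=0.8,
)
print(result[0]["generated_text"])
```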
-
Any real competitor to GPT-3 which is open source and downloadable?
3.) EleutherAI's GPT-Neo and GPT-NeoX: EleutherAI is an independent research organization that aims to promote open research in artificial intelligence. They have released GPT-Neo, an open-source language model based on the GPT architecture, and are developing GPT-NeoX, a highly-scalable GPT-like model. You can find more information on their GitHub repositories: GPT-Neo: https://github.com/EleutherAI/gpt-neo GPT-NeoX: https://github.com/EleutherAI/gpt-neox
-
⚡ Neural - AI Code Generation for Vim
This is one of the first comprehensive plugins rewritten to support multiple AI backends, such as OpenAI GPT-3+, with other custom sources such as ChatGPT, GPT-J, GPT-Neo and more planned for the future.
-
Looks like some Taliban fighters are getting burnt out working the 9-5 grind
GPT-Neo is newer than GPT-2 on the open source side of things. In my experience, it tends to give longer and more creative responses than GPT-2 but not on the level of GPT-3. I've not tried GPT-J or GPT-NeoX, but they're also open source and reportedly better than GPT-Neo (albeit less accessible).
- H3 - a new generative language model that outperforms GPT-Neo-2.7B with only *2* attention layers! In H3, the researchers replace attention with a new layer based on state space models (SSMs). With the right modifications, they find that it can outperform transformers.
- First Open Source Alternative to ChatGPT Has Arrived
-
Where is the line for AI and where does ChatGPT stand?
Finally, yes: it is trained via language modeling (next-token prediction). The approach has been fairly standard for years; the big difference with the GPT* models is the number of parameters and the volume of text. We still haven't reached a ceiling with LLM parameters: they appear to keep improving with size. This training allows the model to learn a strong representation of language. Their training approach is published, and open-source GPT* versions have already been made and released (https://github.com/EleutherAI/gpt-neo). However, the models are huge and can't be run locally by hobbyists. This gets at larger issues in the democratization of ML.
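To make that objective concrete, here is a minimal sketch of next-token prediction with the transformers library (the checkpoint name is just an example):

```python
# Causal language modeling: the model learns to predict each next token.
# Passing labels=input_ids makes transformers compute the shifted
# next-token cross-entropy loss internally.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125m")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125m")

inputs = tokenizer("The Pile is a large text dataset.", return_tensors="pt")
outputs = model(**inputs, labels=inputs["input_ids"])
print(float(outputs.loss))  # average next-token cross-entropy
```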
- Using the GPT-3 AI Writer inside Obsidian(This is COOL)
-
Teaser trailer for "The Diary of Sisyphus" (2023), the world's first feature film written by an artificial intelligence (GPT-NEO) and produced by Briefcase Films, my indie film studio based in Northern Italy
- GPT-Neo 2.7B, released Mar/2021, and unmaintained/unsupported as of Aug/2021? or;
aitextgen
-
Where is the engineering part in "prompt engineer"?
It's literally a wrapper for the ChatGPT API (currently). I have another library for training models from scratch but haven't had time to work on it.
-
self-hosted AI?
I'm experimenting with https://github.com/minimaxir/aitextgen for some simple tasks. It is pretty much a wrapper around GPT-2 and GPT Neo models.
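The API really is small; a minimal generation sketch following aitextgen's README (per the README, the no-argument constructor downloads the 124M GPT-2 "small" model):

```python
from aitextgen import aitextgen

# With no arguments, aitextgen downloads, caches, and loads the
# 124M GPT-2 "small" model (per the project README).
ai = aitextgen()

# Print three samples continuing the prompt.
ai.generate(n=3, prompt="The meaning of life is", max_length=60)
```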
-
How would I go about implementing warmup steps from the Transformers library?
I'm sorry if this is the wrong place to ask, but I wasn't sure where else to turn. Several of us have already opened an issue with aitextgen, but it seems that the maintainer isn't particularly active these days. I'm a fairly proficient (self-taught) developer and I know my way around ML, but I wasn't formally educated in deep learning. A lot of PyTorch Lightning looks like black magic to me. I suspect I'm missing an important detail that would be fairly simple for many of you to identify.
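For anyone who lands here with the same question: the Transformers library ships scheduler helpers for exactly this, and they attach to any PyTorch optimizer. A hedged sketch (the model and step counts are placeholders, not from the issue):

```python
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(10, 10)  # stand-in for the real model
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4)

num_training_steps = 10_000
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=500,                   # LR ramps 0 -> 5e-4 over 500 steps
    num_training_steps=num_training_steps,  # then decays linearly back to 0
)

for step in range(num_training_steps):
    # ... forward pass and loss.backward() would go here ...
    optimizer.step()
    scheduler.step()        # advance the schedule once per optimizer step
    optimizer.zero_grad()
```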
-
NanoGPT
To train small gpt-like models, there's also aitextgen: https://github.com/minimaxir/aitextgen
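Its README shows a small end-to-end loop for training from scratch along these lines (the file name, block size, and step count here are illustrative):

```python
from aitextgen import aitextgen
from aitextgen.TokenDataset import TokenDataset
from aitextgen.tokenizers import train_tokenizer
from aitextgen.utils import GPT2ConfigCPU

file_name = "input.txt"      # plain-text training corpus

train_tokenizer(file_name)   # writes aitextgen.tokenizer.json
config = GPT2ConfigCPU()     # a tiny GPT-2 config sized for CPU training

ai = aitextgen(tokenizer_file="aitextgen.tokenizer.json", config=config)
data = TokenDataset(file_name,
                    tokenizer_file="aitextgen.tokenizer.json",
                    block_size=64)

ai.train(data, batch_size=8, num_steps=5000)
ai.generate(5)               # sample from the freshly trained model
```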
-
Neuro-sama sings "Take On Me" with her Angelic Voice
It's actually relatively easy to train your own GPT model and there are multiple tools out there that make it almost just plug and play: https://github.com/minimaxir/aitextgen
-
Is there a place with all the models indexed?
I've been learning python and for the past few days, I've been playing around with the aitextgen library.
-
I built an AI model to auto-generate Dominion cards. Here are the hilariously bad results.
Then I ran that through the AI and got it to spit out cards that looked like the training data. I used aitextgen. I let it run for about 4 hours, and it produced 10,000 rows of cards. But some of these cards are duplicates of each other or of cards that already exist, use a card name that already exists in the original game, have 20 '|' characters in one row, or have zero '|'. So I run a script to remove all such cards, and I end up with around 2,000-4,500 cards that are "functional".
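The cleanup pass described above boils down to a few line-level filters; a hypothetical sketch (the five-field schema, file names, and name position are my assumptions, not from the post):

```python
# Drop generated rows that are malformed ('|' count), duplicates, or
# reuse a card name from the original game. Schema and file names are
# assumptions for illustration.
with open("original_cards.txt") as f:
    existing_names = {line.split("|")[0].strip().lower()
                      for line in f if line.strip()}

seen, kept = set(), []
with open("generated_cards.txt") as f:
    for row in f:
        row = row.strip()
        fields = row.split("|")
        if len(fields) != 5:                 # zero or far too many '|' separators
            continue
        name = fields[0].strip().lower()
        if name in existing_names:           # name already exists in the game
            continue
        if row.lower() in seen:              # duplicate of an earlier generation
            continue
        seen.add(row.lower())
        kept.append(row)

print(f"{len(kept)} functional cards kept")
```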
-
Thoughts on GPT3?
If you search this subreddit, you should find lots of discussions about it, as well as alternatives like GPT-J (open source). If you'd like to experiment with GPT-2 for text generation, try https://github.com/minimaxir/aitextgen. It's fun to play with.
-
Show HN: Tensorpedia – Using GPT-2 to synthesize Wikipedia articles
Hey HN! I've been lurking for a while now and I've finally created something that I feel is worth sharing.
I've called this project "Tensorpedia." At its core, Tensorpedia takes in a title and utilizes it as a prompt for GPT-2 to synthesize the introductory part of a Wikipedia article. The machine learning stuff is written using a wonderful library called aitextgen [0], using Wikipedia's "Vital Articles" as a data set [1]. The server is written in Node, and it uses Redis as an article cache. If you want to read my article about it (for some reason), you can check it out here [2].
I created this project to get more experience with server technologies. While I wouldn't say it's a complicated application, I learned quite a lot from it.
Additionally, as I was inspired by all of those this-x-doesn't-exist projects from a while back, this project is mostly for fun. As such, I don't know how much practical use it has, but I've generated some pretty hilarious articles from it.
[0] https://github.com/minimaxir/aitextgen
[1] https://en.wikipedia.org/wiki/Wikipedia:Vital_articles/Level...
[2] https://jonahsussman.net/posts/2022-01-this-wiki-dne/
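The core generation step described above reduces to something like the following (the prompt format, model folder, and length are assumptions on my part):

```python
from aitextgen import aitextgen

# Load a GPT-2 model fine-tuned on the Vital Articles corpus.
ai = aitextgen(model_folder="trained_model")

def synthesize_intro(title: str) -> str:
    # Use the requested article title as the prompt and return one
    # generated introduction.
    return ai.generate_one(prompt=f"{title}\n\n", max_length=200)
```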
-
Downloaded GPT-2, Encode.py, and Train.py not found.
If by "downloaded" you mean cloned the gpt-2 GitHub repo, it doesn't come with those scripts. I personally played around with https://github.com/minimaxir/aitextgen, which is a simple wrapper around the GPT-2 code and comes with some very clear usage examples. (Shout out to minimaxir and everyone else involved in aitextgen for making GPT-2 easy to use!)
What are some alternatives?
gpt-neox - An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
lm-evaluation-harness - A framework for few-shot evaluation of language models.
haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
DiscordChatAI-GPT2 - A chat AI discord bot written in python3 using GPT-2, trained on data scraped from every message of my discord server (can be trained on yours too)
openchat - OpenChat: Easy to use opensource chatting framework via neural networks
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
tensorflow - An Open Source Machine Learning Framework for Everyone
nanoGPT - The simplest, fastest repository for training/finetuning medium-sized GPTs.
mesh-transformer-jax - Model parallel transformers in JAX and Haiku
trump_gpt2_bot - aitextgen (aka GPT-2) Twitter bot
gpt4all - gpt4all: run open-source LLMs anywhere