EleutherAI/gpt-neo is an open-source project licensed under the MIT License, an OSI-approved license.
Similar projects and alternatives to gpt-neo
A framework for few-shot evaluation of autoregressive language models.
A community-curated one-stop-shop of resources and information for all things New Golem
An implementation of model-parallel GPT-3-like models on GPUs, based on the DeepSpeed library. Designed to train models with hundreds of billions of parameters or more.
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
🔍 End-to-end Python framework for building natural language search interfaces to data. Leverages Transformers and state-of-the-art NLP. Supports DPR, Elasticsearch, Hugging Face's Hub, and much more!
A robust Python tool for text-based AI training and generation using GPT-2.
An easy-to-use, open-source chat framework built on neural networks.
Optical Simulation software
The simplest app state management for React
(Deprecated) A hub for onboarding & other information.
Fifteen days ago, this beast was published.
reddit.com/r/ArtificialInteligence | 2021-04-15
Idk if the site got the hug of death or what. But https://www.eleuther.ai/ or their github at https://github.com/EleutherAI seem to work for me.
Try to talk with GPT3 (GPT-Neo)
reddit.com/r/ArtificialInteligence | 2021-04-07
The GPT-Neo model was released in the EleutherAI/gpt-neo repository by Sid Black, Stella Biderman, Leo Gao, Phil Wang, and Connor Leahy. It is a GPT-3-like causal language model trained on the Pile dataset.
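Since that description maps directly onto the Transformers causal-LM interface, here is a minimal sketch of loading and sampling from the model. It assumes a Transformers version that includes GPT-Neo support and that the checkpoint is published as "EleutherAI/gpt-neo-1.3B"; both are assumptions worth checking against the current docs.

```python
# Minimal sketch: GPT-Neo as a causal LM via Hugging Face Transformers.
# Assumes a Transformers release with GPT-Neo support and that the
# "EleutherAI/gpt-neo-1.3B" checkpoint name is correct.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-1.3B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")

# Causal LM: the model predicts each next token given everything before it.
inputs = tokenizer("EleutherAI is", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50, do_sample=True, temperature=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Generation is autoregressive: each sampled token is appended to the context before the next one is predicted, which is why longer outputs take proportionally longer to produce.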
Recommendations for a beginner?
reddit.com/r/GPT3 | 2021-03-31
I'm an engineer, mostly web, but am fascinated by AI and would like to learn more. I found this repo that looks like a great start https://github.com/EleutherAI/gpt-neo but at the same time is a little intimidating. There are so many esoteric terms used that it feels as if I am at stage 1 again. AI masters of this subreddit, how did you ramp up and if you could do it all over again are there any resources and/or things you would have done differently to get you where you are now?
Eleuther: A grassroots collective of researchers working to open source AI
news.ycombinator.com | 2021-03-31
AI Can Generate Convincing Text–and Anyone Can Use It
news.ycombinator.com | 2021-03-29
As someone who works on a Python library solely devoted to making AI text generation more accessible to the normal person (https://github.com/minimaxir/aitextgen ) I think the headline is misleading.
Although the article focuses on the release of GPT-Neo, even GPT-2, released in 2019, was good at generating text; it just spat out a lot of garbage requiring curation, which GPT-3/GPT-Neo still require, albeit with a better signal-to-noise ratio.
GPT-Neo, meanwhile, is such a big model that it requires a bit of data-engineering work to get running and generating text (see the README: https://github.com/EleutherAI/gpt-neo ), and it's currently unclear whether it's as good as GPT-3, even when comparing models apples-to-apples.
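As a rough illustration of what an apples-to-apples comparison involves, the sketch below scores GPT-2 and GPT-Neo on identical text via their causal-LM loss. It is a toy check, not a benchmark, and it assumes both checkpoints load through the Transformers Auto classes under the names shown.

```python
# Toy apples-to-apples check (a sketch, not a benchmark): compare the
# mean per-token cross-entropy of GPT-2 and GPT-Neo on the same text.
# Assumes a Transformers version with GPT-Neo support and that the
# "EleutherAI/gpt-neo-1.3B" checkpoint name is correct.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

text = "The Pile is an 800 GB dataset of diverse text for language modeling."

for name in ["gpt2", "EleutherAI/gpt-neo-1.3B"]:
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name)
    model.eval()
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # Passing labels=ids makes the model return the shifted
        # next-token prediction loss (lower is better).
        loss = model(ids, labels=ids).loss
    print(f"{name}: loss = {loss.item():.3f}")
```

GPT-Neo reuses the GPT-2 tokenizer, so per-token losses are directly comparable here; with models using different vocabularies you would need to normalize, for example to bits per byte.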
That said, Hugging Face is adding support for GPT-Neo to Transformers (https://github.com/huggingface/transformers/pull/10848 ) which will help make playing with the model easier, and I'll add support to aitextgen if it pans out.
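Once that PR is merged, playing with the model should reduce to a couple of lines via the pipeline API. A minimal sketch, assuming the integration ships as in the linked PR and the checkpoint is published as "EleutherAI/gpt-neo-2.7B":

```python
# Minimal sketch of GPT-Neo through the Transformers pipeline API.
# Assumes the GPT-Neo PR above has landed and the 2.7B checkpoint is
# hosted under the EleutherAI organization on the Hugging Face Hub.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-2.7B")
result = generator("AI text generation is", max_length=40, do_sample=True, temperature=0.9)
print(result[0]["generated_text"])
```

Note that 2.7B parameters in fp32 is roughly 10 GB of weights alone, so the 1.3B (or a smaller) variant is the saner starting point on commodity hardware.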
To the guys whose existence on this subreddit revolves around complaining and proclaiming that you are going to boycott the game, here's a suggestion: start your own fork of the game. You know, so you would shut up about the devs pocketing your money and not living up to your standards.
reddit.com/r/AIDungeon | 2021-03-27
There's an open GPT-3 project up to around 2 billion parameters. A far cry from 175 billion, but it's on its way. https://github.com/EleutherAI/gpt-neo
China’s GPT-3? BAAI Introduces Superscale Intelligence Model ‘Wu Dao 1.0’ (2.6 billion parameters)
reddit.com/r/mlscaling | 2021-03-23
It sounds like a university was doing this though. If they get an industry partner they could scale it up. This also says it’s open source (though I don’t think I saw a link?). That would put them on competitive footing with Eleuther.
GPT-3 tries pickup lines
news.ycombinator.com | 2021-03-23
I tried the same prompt on GPT-Neo (https://github.com/EleutherAI/gpt-neo), the open-source recreation of OpenAI's GPT-3, specifically their 2.7B model, which should correspond to the smallest model in the article (Ada), the one that produced just pure garbage. The result is surprisingly good:
1. How did this little blossom happen? When are you going to bloom?
EleutherAI releases gpt-neo models trained on GPT3, GPT2
reddit.com/r/LanguageTechnology | 2021-03-23
Hacker News top posts: Mar 22, 2021
reddit.com/r/hackerdigest | 2021-03-22
GPT Neo: open-source GPT model, with pretrained 1.3B & 2.7B weight models (106 comments)
[P] EleutherAI releases 1.3B and 2.7B GPT-3-style models trained on the Pile
reddit.com/r/MachineLearning | 2021-03-21
The GPT-Neo project by EleutherAI has released 1.3B and 2.7B parameter GPT-3-style models. The models are trained on the Pile, an 800 GB curated dataset EleutherAI released in January.
GPT Neo: open-source GPT model, with pretrained 1.3B & 2.7B weight models
reddit.com/r/patient_hackernews | 2021-03-21
GPT Neo: open-source GPT model, with pretrained 1.3B & 2.7B weight models
reddit.com/r/hackernews | 2021-03-21
EleutherAI collective releases its GPT-2-1.3b/GPT-2-2.7b language models trained on 'The Pile' dataset
reddit.com/r/mlscaling | 2021-03-21