gpt-2-simple

Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts (by minimaxir)

Stats

Basic gpt-2-simple repo stats
Mentions: 0
Stars: 2,618
Activity: 1.7
Last commit: about 1 month ago

minimaxir/gpt-2-simple is an open source project licensed under the GNU General Public License v3.0 or later, which is an OSI-approved license.

gpt-2-simple Alternatives

Similar projects and alternatives to gpt-2-simple based on common topics and language

  • GitHub repo AIdegger

    Extended publications of Martin Heidegger uncovered using machine learning.

  • GitHub repo RNN-Twitter-Bot

    🤖✏️ A Twitter bot written in Python trained with a recurrent neural network.

  • GitHub repo textgenrnn

    Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code.

  • GitHub repo Spectrum

    Spectrum is an AI that uses machine learning to generate Rap song lyrics (by YigitGunduc)

  • GitHub repo stable-baselines

    A fork of OpenAI Baselines, implementations of reinforcement learning algorithms

  • GitHub repo stable-baselines3

    PyTorch version of Stable Baselines, reliable implementations of reinforcement learning algorithms.

  • GitHub repo rl-baselines-zoo

    A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.

NOTE: The number of mentions indicates how often a project is mentioned in the same posts as gpt-2-simple. Hence, a higher number suggests a better gpt-2-simple alternative or a higher degree of similarity.

Posts

Posts where gpt-2-simple has been mentioned. We have used some of these posts to build our list of alternatives and similar projects; the most recent mention was on 2021-03-18.
  • I trained GPT-2 on Heidegger texts and am proud to release a WORLD FIRST: the full text of the sequel to Being and Time: Being and Time 2.
    It's pretty easy - https://github.com/minimaxir/gpt-2-simple
  • [OC] The Infinite Pokedex: Using state of the art AI to generate endless Pokedex pages
    I used a library called gpt-2-simple. There's probably a more elegant way to handle this out there, but I used this library to finetune a GPT-2 model (355M) on a dataset of all official Pokedex entries. These entries were formatted [name, type1, type2, entry]. Then, when generating new entries, the finetuned model was given a generated name and types as a prefix for each entry. I threw the Python scripts used into the provided link, but I wouldn't put too much stock in their exact implementation. This was one of the more hacky elements of the generator.
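
For readers unfamiliar with the library, the finetune-then-generate workflow described in the post above maps roughly onto gpt-2-simple's API as follows. This is only a minimal sketch: the dataset file name, training step count, run name, and example prefix are illustrative assumptions, not the poster's actual code.

import gpt_2_simple as gpt2

# Download the pretrained 355M checkpoint (the size mentioned in the post).
gpt2.download_gpt2(model_name="355M")

sess = gpt2.start_tf_sess()

# Finetune on a plain-text file of formatted entries, e.g. one
# "name, type1, type2, entry" record per line. The file name, step
# count, and run name here are hypothetical.
gpt2.finetune(sess,
              dataset="pokedex_entries.txt",
              model_name="355M",
              steps=1000,
              run_name="pokedex")

# Generate new entries, seeding the model with a name and types as the
# prefix; the finetuned model completes the rest of the entry.
# The prefix below is a made-up example standing in for a generated name/type pair.
samples = gpt2.generate(sess,
                        run_name="pokedex",
                        prefix="Florazor, Grass, Flying,",
                        length=100,
                        temperature=0.8,
                        nsamples=3,
                        return_as_list=True)
for sample in samples:
    print(sample)

Generating immediately after finetune() reuses the same TensorFlow session; to sample from a previously saved run instead, gpt2.load_gpt2(sess, run_name="pokedex") can be called before generate().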