textgenrnn vs gpt-2-simple
| | textgenrnn | gpt-2-simple |
|---|---|---|
| Mentions | 7 | 13 |
| Stars | 4,943 | 3,366 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Latest Commit | almost 2 years ago | over 1 year ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
textgenrnn
-
Modern alternative to textgenrnn?
Try this:
1) Uninstall textgenrnn (not sure if that's necessary): `pip3 uninstall textgenrnn`
2) Reinstall it from GitHub using one of these commands:
* `pip3 install git+git://github.com/minimaxir/textgenrnn.git`
* `pip3 install git+https://github.com/minimaxir/textgenrnn.git`
(Try the first one; if it raises an error, try the second.) Here's the discussion of the "multi_gpu_model not found" error: https://github.com/minimaxir/textgenrnn/issues/222
-
How do you get a Python IDE to execute Python code?
Thanks! PyCharm seems to work well enough. However, I'm not sure how to run an existing project. I successfully imported this GitHub release: https://github.com/minimaxir/textgenrnn. But I don't see any "main" file that can be executed. If I run the "setup.py" file, it complains that there is "no command to be executed." Also, import statements seem to fail, even though I literally have the files in the project that I'm trying to import. :/ I don't know how I'm supposed to run commands defined in the project if they can't be imported into a script.
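For context, textgenrnn is a library rather than an application, so there is no "main" file; you install the package and write your own script that imports it. A minimal sketch along the lines of the README's usage example (assuming the package was installed with pip, not imported from a cloned folder):

```python
from textgenrnn import textgenrnn

# Instantiating with no arguments loads the bundled pretrained weights.
textgen = textgenrnn()

# Generate and print a few lines of text from the pretrained model.
textgen.generate(5)
```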
-
I trained a neural network on every town and village name in England and then made a website that lets you generate them
This is an utterly fantastic question, but I actually don't understand much about neural nets at all; I'm just using a prebuilt Python library that is basically text-based neural nets for idiots (https://github.com/minimaxir/textgenrnn). Happy to share the data, though, if you want to find this out and know how.
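For readers curious what that looks like in practice, here is a minimal sketch of fine-tuning textgenrnn on a file of names, one per line; the filename and epoch count are illustrative, not the author's actual settings:

```python
from textgenrnn import textgenrnn

textgen = textgenrnn()  # start from the bundled pretrained weights

# Fine-tune on a text file with one place name per line
# ("england_place_names.txt" is a hypothetical filename).
textgen.train_from_file('england_place_names.txt', num_epochs=10)

# Lower temperatures give more conservative, name-like output.
textgen.generate(10, temperature=0.5)
```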
-
The Office - an episode generated by NN
Python package: Textgenrnn
-
makeitmetal.net - A metal band name generator using Python/Tensorflow
I started off with the model from textgenrnn, which is an RNN with LSTM layers. I'm not an expert on deep learning or different model architectures, and textgenrnn is a great starting point because you can initialize your model from pretrained weights. I tweaked the model architecture slightly to allow multiple context labels, but most of the work on this project was porting the model to TensorFlow.js so it could run in the browser once I'd trained it in Python.
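Stock textgenrnn supports a single context label per training text via train_on_texts; a rough sketch of that baseline (the band names and labels below are made up, and this is the single-label API the author extended, not their modified version):

```python
from textgenrnn import textgenrnn

# Illustrative data only: one context label per training text.
band_names = ['Iron Maiden', 'Black Sabbath', 'Judas Priest']
labels = ['heavy', 'doom', 'heavy']

textgen = textgenrnn()  # initialize from the pretrained weights
textgen.train_on_texts(band_names, context_labels=labels, num_epochs=5)
textgen.generate(5)
```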
-
What's wrong with my import?
Is it installed? https://github.com/minimaxir/textgenrnn#usage
gpt-2-simple
-
Show HN: WhatsApp-Llama: A clone of yourself from your WhatsApp conversations
Tap the contact's name in WhatsApp (I think it only works on a phone) and at the bottom of that screen there's Export Chat.
For finetuning GPT-2 I think I used this thing on Google Colab. (My friend ran it on his GPU; it should be doable on most modern-ish GPUs.)
https://github.com/minimaxir/gpt-2-simple
I tried doing something with this a few months ago though, and it was a bit of a hassle to get running (it needed a specific Python version for some dependencies...); I forget the details, sorry!
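A rough sketch of that gpt-2-simple workflow, following the library's README; the chat filename and step count are assumptions, not the poster's actual values:

```python
import gpt_2_simple as gpt2

# Download the smallest GPT-2 model (~500 MB) if not already present.
gpt2.download_gpt2(model_name="124M")

sess = gpt2.start_tf_sess()

# Fine-tune on the exported chat ("whatsapp_chat.txt" is a hypothetical name).
# Checkpoints are written to checkpoint/run1 by default.
gpt2.finetune(sess, "whatsapp_chat.txt", model_name="124M", steps=1000)

# Sample from the fine-tuned model.
gpt2.generate(sess)
```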
-
indistinguishable
I mentioned in a different reply that I used https://github.com/minimaxir/gpt-2-simple
-
training gpt on your own sources - how does it work? gpt2 v gpt3? and how much does it cost?
You will need a few hundred bucks, python experience, and a simple implementation such as this repo https://github.com/minimaxir/gpt-2-simple
-
I (re)trained an AI using the 36 lessons of Vivec, the entirety of C0DA, the communist manifesto and the top posts of /r/copypasta and asked it the most important/unanswered lore questions. What are the lore implications of these insights?
I just used the gpt-2-simple Python package and ran it overnight in a Jupyter notebook, but you could copy the code into any Python interpreter and it should also work.
-
How do I start a personal project?
I'll note that if you're just doing text generation, it is a simple project as far as ML goes; there are some nice libraries you can use that require minimal ML knowledge, e.g. https://github.com/minimaxir/gpt-2-simple
-
I created a twitter account that posts AI generated Canucks related tweets. I call it "Canucks Artificial Insider".
Then I use the GPT-2 AI libraries, wrapped in the Python library gpt-2-simple, to generate the content. My actual code is basically just their code sample, about six lines of Python. With GPT-2, you train the existing AI on your specific dataset, which in my case is this text file of tweets.
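The code sample being referenced is presumably the README's six-line fine-tuning example. For the posting side of a bot like this, a sketch of reloading a fine-tuned checkpoint and generating tweet-sized samples (the length, temperature, and run name below are guesses):

```python
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="run1")  # reload the fine-tuned checkpoint

# Generate a handful of short samples and keep them as strings.
tweets = gpt2.generate(sess,
                       run_name="run1",
                       length=60,        # roughly tweet-sized
                       temperature=0.8,
                       nsamples=5,
                       return_as_list=True)
print(tweets[0])
```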
-
Training GPT-2 with HuggingFace Transformers to sound like a certain author
gpt_2_simple is your best bet! It's super easy to use; you just need to downgrade TensorFlow and some other packages in your environment.
-
These Magic cards don't exist - Generating names for new cards using machine learning and GPT-2.
I used the GPT-2 Simple program by minimaxir to train the algorithm on every card in Magic's history that was released in a main expansion. Then I generated about 2,000 (it was actually more, but the algorithm really liked giving me cards that already exist) new names which I searched through to find the best ones.
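Generating a couple thousand samples in one go is easier with gpt_2_simple's generate_to_file; a sketch, assuming a checkpoint fine-tuned as described above (the destination filename, length, and batch size are illustrative):

```python
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess)  # reload the fine-tuned checkpoint (default run_name "run1")

# Write ~2,000 generated names to a file; batch_size must divide nsamples evenly.
gpt2.generate_to_file(sess,
                      destination_path="card_names.txt",
                      length=20,  # card names are short
                      nsamples=2000,
                      batch_size=20)
```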
-
No rush, mostly curious (training/finetuned models)
Might I suggest starting here, to learn with gpt-2-simple. They have a Google Colab notebook if your CPU/GPU is shit, and what helped me learn best was to dissect the code and basically rebuild my own Colab notebook piece by piece, learning what each function does.
-
Selecting good hyper-parameters for fine-tuning a GPT-2 model?
The last couple of months, I've been running a Twitter bot that posts GPT-2-generated content, trained on tweets from existing accounts using gpt-2-simple. In my more recent training sessions, it seems like the quality of the output has been decreasing; it often gives outputs that are just barely modified from the original training data, if not verbatim.
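For reference, these are some of the knobs gpt_2_simple.finetune exposes that bear on the kind of memorization described above; the values below are guesses, not recommendations:

```python
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              "tweets.txt",        # hypothetical training file
              model_name="124M",
              steps=500,           # small datasets overfit quickly; fewer steps can help
              learning_rate=1e-4,  # the library default; consider lowering it
              sample_every=100,    # print samples during training to spot memorization early
              save_every=250,
              run_name="tweets_run")
```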
What are some alternatives?
deepface - A Lightweight Face Recognition and Facial Attribute Analysis (Age, Gender, Emotion and Race) Library for Python
Style-Transfer-in-Text - Paper List for Style Transfer in Text
d2l-en - Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
ctrl-sum - Resources for the "CTRLsum: Towards Generic Controllable Text Summarization" paper
Spectrum - Spectrum is an AI that uses machine learning to generate Rap song lyrics
rex-gym - OpenAI Gym environments for an open-source quadruped robot (SpotMicro)
DeepAA - make ASCII Art by Deep Learning
openai-api-py-lite - OpenAI API Python bindings with no dependencies
autokeras - AutoML library for deep learning
gpt_index - LlamaIndex (GPT Index) is a project that provides a central interface to connect your LLMs with external data. [Moved to: https://github.com/jerryjliu/llama_index]
build_a_neural_net_live - The LSTM Neural net code for @Sirajology on Youtube's live video
AIdegger - Extended publications of Martin Heidegger uncovered using machine learning.