gpt-2-simple Alternatives
Similar projects and alternatives to gpt-2-simple
-
Style-Transfer-in-Text
Paper List for Style Transfer in Text
-
textgenrnn
Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code.
-
ctrl-sum
Resources for the "CTRLsum: Towards Generic Controllable Text Summarization" paper
-
rex-gym
OpenAI Gym environments for an open-source quadruped robot (SpotMicro)
-
gpt_index
Discontinued: LlamaIndex (GPT Index) provides a central interface to connect your LLMs with external data. [Moved to: https://github.com/jerryjliu/llama_index]
-
openai-api-py-lite
OpenAI API Python bindings with no dependencies
-
nostalgebraist-autoresponder
Code for the tumblr bot nostalgebraist-autoresponder.
-
AIdegger
Extended publications of Martin Heidegger uncovered using machine learning.
-
WhatsApp-Llama
Finetune an LLM to speak like you based on your WhatsApp conversations
-
gpt-2-simple reviews and mentions
-
Show HN: WhatsApp-Llama: A clone of yourself from your WhatsApp conversations
Tap the contact's name in WhatsApp (I think it only works on a phone) and at the bottom of that screen there's Export Chat.
For finetuning GPT-2 I think I used this thing on Google Colab. (My friend ran it on his GPU, it should be doable on most modern-ish GPUs.)
https://github.com/minimaxir/gpt-2-simple
I tried doing something with this a few months ago, though, and it was a bit of a hassle to get running (needed to use a specific Python version for some dependencies...). I forget the details, sorry!
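For reference, a minimal finetuning sketch with gpt-2-simple, roughly what the Colab workflow described above does; the dataset path, run name, and step count are illustrative placeholders, not the commenter's actual settings.

```python
import gpt_2_simple as gpt2

# Download the smallest GPT-2 checkpoint (124M parameters) if it isn't cached yet.
gpt2.download_gpt2(model_name="124M")

# Finetune on a plain-text export of the chat; the file name is hypothetical.
sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="chat_export.txt",  # exported WhatsApp chat, saved as plain text
              model_name="124M",
              steps=1000,                 # illustrative; adjust for your dataset size
              run_name="whatsapp")

# Sample from the finetuned model.
gpt2.generate(sess, run_name="whatsapp")
```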
-
indistinguishable
I mentioned in a different reply that I used https://github.com/minimaxir/gpt-2-simple
-
Training GPT on your own sources - how does it work? GPT-2 vs. GPT-3? And how much does it cost?
You will need a few hundred bucks, Python experience, and a simple implementation such as this repo: https://github.com/minimaxir/gpt-2-simple
-
Training GPT-2 with HuggingFace Transformers to sound like a certain author
gpt_2_simple is your best bet! It's super easy to use; you just need to downgrade TensorFlow and some other packages in your environment.
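As a rough illustration of the version pinning this comment alludes to (the exact pins depend on which gpt-2-simple release you install, so treat these as assumptions rather than documented requirements):

```python
# Recent gpt-2-simple releases (0.8.x) target TensorFlow 2.x, while older 0.7.x
# releases require TensorFlow 1.x, so pin the pair accordingly, e.g.:
#   pip install "gpt-2-simple>=0.8" "tensorflow>=2.5"
# A quick sanity check before finetuning:
import tensorflow as tf
import gpt_2_simple as gpt2  # the import fails outright if the pins are incompatible

print("TensorFlow version:", tf.__version__)
```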
-
I trained GPT-2 on Heidegger texts and am proud to release a WORLD FIRST: the full text of the sequel to Being and Time: Being and Time 2.
It's pretty easy - https://github.com/minimaxir/gpt-2-simple
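For a sense of what "pretty easy" means in practice, here is a hedged generation sketch with gpt-2-simple; the run name and prefix are made-up placeholders, not the poster's actual settings.

```python
import gpt_2_simple as gpt2

# Load a checkpoint previously produced by gpt2.finetune(); "heidegger" is a placeholder run name.
sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="heidegger")

# Sample a continuation from a prompt; all parameters here are illustrative.
text = gpt2.generate(sess,
                     run_name="heidegger",
                     prefix="The question of Being must be asked again,",
                     length=200,
                     temperature=0.8,
                     return_as_list=True)[0]
print(text)
```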
Stats
minimaxir/gpt-2-simple is an open source project licensed under the GNU General Public License v3.0 or later, which is an OSI-approved license.
The primary programming language of gpt-2-simple is Python.