rex-gym
gpt-2-simple
| | rex-gym | gpt-2-simple |
|---|---|---|
| Mentions | 1 | 13 |
| Stars | 957 | 3,366 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Latest commit | about 1 year ago | over 1 year ago |
| Language | Python | Python |
| License | Apache License 2.0 | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
rex-gym
-
By moving the battery pack forward, you can make the popular SpotMicro design balance much better. We had trouble getting it to stand and walk because the center of mass was too far back.
Our work was based on [SpotMicro](https://github.com/michaelkubina/SpotMicroESP32) and [Rex Gym](https://github.com/nicrusso7/rex-gym). Our GitHub is [here](https://github.com/LSaldyt/laser-dog).
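The balance argument above is just the mass-weighted mean of component positions: moving a heavy battery forward pulls the center of mass forward. A minimal sketch, with made-up masses and positions (not measured from the SpotMicro design):

```python
# Center of mass along the body's x-axis as a mass-weighted average.
# All numbers below are illustrative, not actual SpotMicro measurements.
def center_of_mass(parts):
    """parts: list of (mass_kg, x_position_m) pairs; returns CoM x-position."""
    total_mass = sum(m for m, _ in parts)
    return sum(m * x for m, x in parts) / total_mass

frame_battery_rear = [(1.2, 0.0), (0.3, -0.10)]   # body at origin, battery behind it
frame_battery_fwd  = [(1.2, 0.0), (0.3, 0.10)]    # same battery moved forward

print(center_of_mass(frame_battery_rear))  # negative: CoM sits behind the body center
print(center_of_mass(frame_battery_fwd))   # positive: CoM shifts forward of center
```

With the battery over or ahead of the leg workspace, the stance polygon can contain the CoM, which is what makes standing and walking tractable.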
gpt-2-simple
-
Show HN: WhatsApp-Llama: A clone of yourself from your WhatsApp conversations
Tap the contact's name in WhatsApp (I think it only works on a phone) and at the bottom of that screen there's Export Chat.
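The export is a plain-text log, so the usual first step is stripping the timestamp prefix before feeding it to a model. A minimal sketch; WhatsApp's export format varies by platform and locale, so the `[date, time] Name: message` pattern here is an assumption, not part of the linked project:

```python
import re

# Assumed export line shape: "[12/01/23, 9:41 PM] Alice: hey"
# (system messages without a "Name:" prefix are skipped).
LINE_RE = re.compile(r"^\[[^\]]+\]\s+([^:]+):\s+(.*)$")

def parse_export(text):
    """Turn a raw WhatsApp export into (speaker, message) pairs."""
    pairs = []
    for line in text.splitlines():
        m = LINE_RE.match(line)
        if m:
            pairs.append((m.group(1), m.group(2)))
    return pairs

sample = "[12/01/23, 9:41 PM] Alice: hey\n[12/01/23, 9:42 PM] Bob: hi there"
print(parse_export(sample))
```

From the pairs you can emit whatever flat-text format your finetuning tool expects (e.g. `Speaker: message` per line).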
For finetuning GPT-2 I think I used this thing on Google Colab. (My friend ran it on his GPU, it should be doable on most modern-ish GPUs.)
https://github.com/minimaxir/gpt-2-simple
I tried doing something with this a few months ago, though, and it was a bit of a hassle to get running (it needed a specific Python version for some dependencies...); I forget the details, sorry!
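For reference, the basic gpt-2-simple workflow is short. A hedged quickstart sketch: the package and function names come from the project's README, but the dataset filename is an example, and the download/finetune steps need network access and a reasonably capable GPU (or Colab):

```shell
# Install the package, then finetune the smallest GPT-2 checkpoint on a
# plain-text file. "chat.txt" is a placeholder for your own dataset.
pip install gpt-2-simple
python - <<'EOF'
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")   # fetch the 124M-parameter checkpoint
sess = gpt2.start_tf_sess()
gpt2.finetune(sess, dataset="chat.txt", model_name="124M", steps=500)
gpt2.generate(sess)                     # sample from the finetuned model
EOF
```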
-
indistinguishable
I mentioned in a different reply that I used https://github.com/minimaxir/gpt-2-simple
-
training gpt on your own sources - how does it work? gpt2 v gpt3? and how much does it cost?
You will need a few hundred bucks, python experience, and a simple implementation such as this repo https://github.com/minimaxir/gpt-2-simple
-
Training GPT-2 with HuggingFace Transformers to sound like a certain author
gpt_2_simple is your best bet! It's super easy to use; you just need to downgrade TensorFlow and some other packages in your environment.
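The dependency pinning is usually the fiddly part. A sketch of isolating it in a virtualenv; the exact pins vary by gpt-2-simple release (older releases needed TensorFlow 1.x, recent ones target 2.x), so treat the version below as an example and check the repo's setup requirements before copying it:

```shell
# Keep the downgraded packages out of your global environment.
python -m venv gpt2-env
source gpt2-env/bin/activate
pip install "tensorflow==2.5.1" gpt-2-simple   # example pin, not authoritative
```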
-
I trained GPT-2 on Heidegger texts and am proud to release a WORLD FIRST: the full text of the sequel to Being and Time: Being and Time 2.
It's pretty easy - https://github.com/minimaxir/gpt-2-simple
What are some alternatives?
Style-Transfer-in-Text - Paper List for Style Transfer in Text
textgenrnn - Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code.
pybullet-gym - Open-source implementations of OpenAI Gym MuJoCo environments for use with the OpenAI Gym Reinforcement Learning Research Platform.
robot-gym - RL applied to robotics.
drl_grasping - Deep Reinforcement Learning for Robotic Grasping from Octrees
stable-baselines3 - PyTorch version of Stable Baselines, reliable implementations of reinforcement learning algorithms.
ctrl-sum - Resources for the "CTRLsum: Towards Generic Controllable Text Summarization" paper
PILCO - Bayesian Reinforcement Learning in Tensorflow
gretel-synthetics - Synthetic data generators for structured and unstructured text, featuring differentially private learning.
ur_openai_gym - OpenAI Gym interface for Universal Robots with ROS Gazebo
gym-battleship - Battleship environment for reinforcement learning tasks
spot_mini_mini - Dynamics and Domain Randomized Gait Modulation with Bezier Curves for Sim-to-Real Legged Locomotion.