machine-learning-book vs embedding-encoder
| | machine-learning-book | embedding-encoder |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 2,863 | 40 |
| Growth | - | - |
| Activity | 6.8 | 0.0 |
| Latest Commit | 8 days ago | 9 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
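The exact weighting formula behind the activity score is not published; as a rough illustration of "recent commits have higher weight than older ones", assume an exponentially decayed commit count (the function name, half-life, and decay shape below are all assumptions, not the site's actual method):

```python
import math


def activity_score(commit_ages_days, half_life=30.0):
    """Recency-weighted commit count (illustrative only).

    Each commit contributes 2 ** (-age / half_life), so a commit from
    today counts 1.0 and a commit from `half_life` days ago counts 0.5.
    """
    return sum(2 ** (-age / half_life) for age in commit_ages_days)


score = activity_score([0, 30, 60])  # 1.0 + 0.5 + 0.25 = 1.75
```

Under this scheme a project with many recent commits scores far higher than one with the same number of commits spread over past years, which matches the relative ranking described above.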
machine-learning-book
Implementing a ChatGPT-like LLM from scratch, step by step
Sorry, in that case I would rather recommend a dedicated RL book. The RL part in LLMs will be very specific to LLMs, and I will only cover what's absolutely relevant in terms of background info. I do have a longish intro chapter on RL in my other general ML/DL book (https://github.com/rasbt/machine-learning-book/tree/main/ch1...) but like others said, I would recommend a dedicated RL book in your case.
"Machine Learning with PyTorch and Scikit-Learn" book
All the code examples are available here: https://github.com/rasbt/machine-learning-book
embedding-encoder
scikit-learn transformer that turns categorical variables into dense vector representations
Github: https://github.com/cpa-analytics/embedding-encoder
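The transformer's job is to replace each categorical value with a dense vector (an entity embedding) so downstream scikit-learn estimators can consume it. The sketch below illustrates only that fit/transform interface, not the library's actual API: `ToyEmbeddingEncoder` is a hypothetical stand-in, and its random vectors substitute for the embeddings a real encoder would learn with a neural network.

```python
import random


class ToyEmbeddingEncoder:
    """Toy scikit-learn-style transformer mapping categories to dense vectors.

    A real embedding encoder learns these vectors with a neural network;
    here they are seeded random draws, to keep the sketch self-contained.
    """

    def __init__(self, dim=4, seed=0):
        self.dim = dim
        self.seed = seed

    def fit(self, X, y=None):
        # Build one dense vector per distinct category seen during fit.
        rng = random.Random(self.seed)
        categories = sorted({value for row in X for value in row})
        self.embeddings_ = {
            c: [rng.gauss(0.0, 1.0) for _ in range(self.dim)] for c in categories
        }
        return self

    def transform(self, X):
        # Concatenate the embeddings of all categorical values in each row.
        return [[x for value in row for x in self.embeddings_[value]] for row in X]


X = [["red"], ["blue"], ["red"]]
dense = ToyEmbeddingEncoder(dim=2).fit(X).transform(X)
# Rows with the same category map to the same vector: dense[0] == dense[2].
```

Because the class follows the fit/transform convention, a transformer like this can slot into a scikit-learn `Pipeline` ahead of any numeric estimator.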
What are some alternatives?
skorch - A scikit-learn compatible neural network library that wraps PyTorch
machine_learning_complete - A comprehensive machine learning repository containing 30+ notebooks on different concepts, algorithms and techniques.
python-machine-learning-book-3rd-edition - The "Python Machine Learning (3rd edition)" book code repository
machine-learning-articles - 🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
ML-Workspace - 🛠 All-in-one web-based IDE specialized for machine learning and data science.
Fast-Transformer - An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer Variant in TensorFlow
gdrl - Grokking Deep Reinforcement Learning
hyperlearn - 2-2000x faster ML algos, 50% less memory usage, works on all hardware - new and old.
Traffic-Lights-Classification-CNN - Coding and testing a convolutional neural network for classifying traffic lights, with Keras and TensorFlow, using the LISA dataset.