# ai vs deep-significance

| | ai | deep-significance |
|---|---|---|
| Mentions | 6 | 6 |
| Stars | 19 | 316 |
| Growth | - | - |
| Activity | 3.5 | 4.0 |
| Last Commit | about 1 month ago | 7 months ago |
| Language | Python | Python |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
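The site does not publish its exact formula, but the description above (recent commits weighted more heavily, reported as a percentile among tracked projects) can be sketched as follows. The exponential half-life and the 0-10 scaling are assumptions for illustration, not the site's actual method:

```python
def activity_score(days_since_commits, all_project_scores, half_life_days=30.0):
    """Sketch of a recency-weighted activity metric (assumed form).

    Each commit contributes a weight that halves every `half_life_days`,
    and the raw score is reported as a percentile (0-10) among the raw
    scores of all tracked projects.
    """
    raw = sum(0.5 ** (d / half_life_days) for d in days_since_commits)
    rank = sum(s <= raw for s in all_project_scores) / len(all_project_scores)
    return round(10 * rank, 1)
```

Under this reading, a score of 9.0 means the project's recency-weighted commit activity sits above 90% of tracked projects.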
## ai
- Made the YouTube Series "Implementing ML Models Using NumPy"
  GitHub (for model implementations and other series): https://github.com/oniani/ai
- [D] What advanced models would you like to see implemented from scratch?
  All of the videos are and will be available on my YouTube channel. Implementations are and will be in this GitHub repo.
- [N] I Have Released the YouTube Series Discussing and Implementing Activation Functions
  GitHub: https://github.com/oniani/ai
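The repo holds the authoritative implementations from the series; as a self-contained illustration, a few common activation functions can be sketched in NumPy. The branch-split sigmoid below is a standard numerical-stability trick and not necessarily the repo's exact code:

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    """Numerically stable sigmoid: never exponentiates a large positive number."""
    out = np.empty_like(x, dtype=float)
    pos = x >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    exp_x = np.exp(x[~pos])            # safe: x < 0 here, so exp cannot overflow
    out[~pos] = exp_x / (1.0 + exp_x)
    return out

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified linear unit: elementwise max(0, x)."""
    return np.maximum(x, 0.0)

def tanh(x: np.ndarray) -> np.ndarray:
    """Hyperbolic tangent (NumPy's implementation is already stable)."""
    return np.tanh(x)
```

Splitting the sigmoid by sign avoids the overflow warnings a naive `1 / (1 + np.exp(-x))` produces for large negative inputs.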
- Implementing Logistic Regression from Scratch
  Link to the YouTube video: https://www.youtube.com/watch?v=YDa3rX9yLCE
  Link to the repo containing the code: https://github.com/oniani/ai
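The video and repo carry the actual implementation; the core idea can be sketched as binary logistic regression trained with batch gradient descent. Class and parameter names here are illustrative, not the repo's:

```python
import numpy as np

class LogisticRegression:
    """Binary logistic regression trained with batch gradient descent (sketch)."""

    def __init__(self, lr: float = 0.1, n_iters: int = 1000) -> None:
        self.lr = lr
        self.n_iters = n_iters

    def fit(self, x: np.ndarray, y: np.ndarray) -> "LogisticRegression":
        n_samples, n_features = x.shape
        self.w = np.zeros(n_features)
        self.b = 0.0
        for _ in range(self.n_iters):
            p = 1.0 / (1.0 + np.exp(-(x @ self.w + self.b)))  # sigmoid of logits
            grad_w = x.T @ (p - y) / n_samples   # gradient of mean cross-entropy
            grad_b = np.mean(p - y)
            self.w -= self.lr * grad_w
            self.b -= self.lr * grad_b
        return self

    def predict(self, x: np.ndarray) -> np.ndarray:
        # Logit >= 0 corresponds to predicted probability >= 0.5
        return (x @ self.w + self.b >= 0).astype(int)
```

The gradient `x.T @ (p - y) / n` falls directly out of differentiating the mean cross-entropy loss with respect to the weights.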
- [N] AI/ML Model API Design, Numerical Stability, and More Models from Scratch! (stylepoint)
  Repository for the AI/ML series: oniani/ai.
- Implementing Machine Learning Models From Scratch (stylepoint)
  Thanks! One thing to note about that implementation is that we could have passed the features and labels directly to the fit method. This would avoid unnecessary data copying (i.e., storing data inside the LinearRegression class). I have already updated the GitHub codebase.
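The design point above, passing data to `fit` rather than storing it on the instance, can be illustrated with a minimal ordinary-least-squares sketch. This is not the repo's actual code; only the learned parameters are kept after fitting:

```python
import numpy as np

class LinearRegression:
    """Ordinary least squares; data is passed to `fit`, never stored on the instance."""

    def fit(self, x: np.ndarray, y: np.ndarray) -> "LinearRegression":
        # Append a bias column and solve the least-squares problem
        xb = np.hstack([x, np.ones((x.shape[0], 1))])
        coef, *_ = np.linalg.lstsq(xb, y, rcond=None)
        self.w, self.b = coef[:-1], coef[-1]
        return self

    def predict(self, x: np.ndarray) -> np.ndarray:
        return x @ self.w + self.b
```

Since `fit` consumes `x` and `y` without assigning them to attributes, the class holds no reference to the training set, which is the copy-avoidance the comment describes.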
## deep-significance
- [P] deep-significance: Enabling easy statistical significance testing for deep neural networks
- [D] Statistical Significance in Deep RL Papers: What is going on?
  Because I was so frustrated by this topic as well, I actually reimplemented and packaged a test specifically for NNs and gave it a lot of documentation, in the hope of lowering the entry barrier as much as possible: https://github.com/Kaleidophon/deep-significance
- deep-significance: Easy and Better Significance Testing for Deep Neural Networks
- [P] deep-significance: Easy and Better Significance Testing for Deep Neural Networks
- [Project] deep-significance: Easy and Better Significance Testing for Deep Neural Networks (link below)
- [P] deep-significance: Easy and Better Significance Testing for Deep Neural Networks (link below)
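deep-significance packages significance tests aimed at neural-network score distributions (its documentation centers on Almost Stochastic Order). As a self-contained illustration of the underlying idea, comparing samples of scores across runs rather than single numbers, here is a basic one-sided permutation test; this is a simpler stand-in, not the package's API:

```python
import numpy as np

def permutation_test(scores_a, scores_b, n_resamples=10_000, seed=0):
    """One-sided permutation test.

    Returns a p-value for the null hypothesis that A's observed mean advantage
    over B (e.g. across random seeds) could arise by chance under relabeling.
    """
    rng = np.random.default_rng(seed)
    a, b = np.asarray(scores_a, float), np.asarray(scores_b, float)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)  # random relabeling of the pooled scores
        diff = pooled[: len(a)].mean() - pooled[len(a):].mean()
        count += diff >= observed
    # Add-one smoothing keeps the p-value strictly positive
    return (count + 1) / (n_resamples + 1)
```

The point the thread makes still applies: the test needs a sample of scores per model (multiple seeds/runs), not one number each.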
## What are some alternatives?
nannyml - post-deployment data science in Python
Note - A machine learning library for easily implementing parallel and distributed training. The Note.neuralnetwork.tf package includes Llama2, Llama3, Gemma, CLIP, ViT, ConvNeXt, BEiT, Swin Transformer, Segformer, and more; models built with Note are compatible with TensorFlow and can be trained with TensorFlow.
ludwig - Low-code framework for building custom LLMs, neural networks, and other AI models
openrec - OpenRec is an open-source and modular library for neural network-inspired recommendation algorithms
horovod - Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
clearml - ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data Management, Pipeline, Orchestration, Scheduling & Serving in one MLOps/LLMOps solution