pyprobml
score_sde
| | pyprobml | score_sde |
|---|---|---|
| Mentions | 3 | 6 |
| Stars | 6,257 | 1,242 |
| Stars growth (monthly) | 1.7% | - |
| Activity | 6.2 | 0.0 |
| Latest commit | 4 months ago | over 1 year ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pyprobml
-
Best Possible Book Recommended for Machine Learning [Discussion] [D] [Recommendation]
Another great book is Kevin Murphy's Machine Learning: A Probabilistic Perspective. He just launched the second edition of his book, and he has a Python repo for the models and graphs: https://github.com/probml/pyprobml
-
Probabilistic Machine Learning, Kevin Murphy (2nd edition, 2021)
This exists, actually; it's not complete yet (I think?), but it covers a lot of the material in the book:
https://github.com/probml/pyprobml
score_sde
- Ask HN: How to get back into AI?
-
[D] Variance of sampling in diffusion models
Perhaps the ODE interpretation would be helpful (see here and here): it turns DDPMs into neural ODEs via the Fokker-Planck equation, so after the initial noise draw the sampling process is deterministic. If samples are still noisy even with the full number of steps, you might need to increase the number of steps further.
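The deterministic-sampling point above can be sketched on a toy problem. This is a minimal illustration, not score_sde's actual API: it assumes a 1-D variance-preserving diffusion with a constant noise schedule `beta(t) = b` and Gaussian data `N(0, s^2)`, so the score of every marginal is known in closed form and stands in for a learned score network. All names here are made up for the example.

```python
import numpy as np

s, b = 2.0, 5.0  # data std and constant noise schedule beta(t) = b

def marginal_var(t):
    # Forward VP marginal: x_t = e^{-b t / 2} x_0 + sqrt(1 - e^{-b t}) * eps
    a2 = np.exp(-b * t)
    return a2 * s**2 + (1.0 - a2)

def pf_ode_drift(x, t):
    # Probability-flow ODE: dx/dt = f(x, t) - 0.5 * g(t)^2 * score(x, t),
    # with f = -0.5 * b * x, g^2 = b, and score = -x / marginal_var(t)
    return -0.5 * b * x + 0.5 * b * x / marginal_var(t)

rng = np.random.default_rng(0)
n_steps, n_samples = 1000, 100_000
dt = 1.0 / n_steps

# Draw the starting noise once; everything after this is deterministic.
x = rng.normal(0.0, np.sqrt(marginal_var(1.0)), size=n_samples)
for i in range(n_steps):
    t = 1.0 - i * dt
    x = x - dt * pf_ode_drift(x, t)  # plain Euler step backward in time

print(x.std())  # the sample std should approach the data std s
```

Because the only randomness is the initial draw at `t = 1`, rerunning the loop from the same noise always yields the same samples, which is exactly the "deterministic after the initial starting noise" behavior described above.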
-
[D] Why is the diffusion model so powerful, but the math behind it so simple?
Turns out that diffusion models also define a certain differential equation, making them neural ODEs. You can then integrate the ODE in the other direction to get the exact inverse of the DDPM (it's not entirely exact because of numerical error in the solver, but close enough).
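The invertibility claim can also be checked numerically on the same kind of toy setup: encode data by integrating the probability-flow ODE forward to a latent, then integrate back and recover the input up to solver error. As before, this is a hedged sketch with a closed-form Gaussian score standing in for a trained model; the names are illustrative, not from score_sde.

```python
import numpy as np

s, b = 2.0, 5.0  # data std and constant noise schedule beta(t) = b

def marginal_var(t):
    a2 = np.exp(-b * t)
    return a2 * s**2 + (1.0 - a2)

def pf_ode_drift(x, t):
    # Probability-flow ODE drift: -0.5*b*x - 0.5*b*score, score = -x / v(t)
    return -0.5 * b * x + 0.5 * b * x / marginal_var(t)

n_steps = 2000
dt = 1.0 / n_steps
x0 = np.array([1.5, -0.3, 2.2])

x = x0.copy()
for i in range(n_steps):              # encode: integrate t = 0 -> 1
    x = x + dt * pf_ode_drift(x, i * dt)
latent = x.copy()

for i in range(n_steps):              # decode: integrate t = 1 -> 0
    x = x - dt * pf_ode_drift(x, 1.0 - i * dt)

print(np.max(np.abs(x - x0)))  # small residual, shrinks as n_steps grows
```

The round-trip error comes only from the Euler discretization, matching the comment above that the inverse is exact up to numerical error in the solver.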
- [D] Are DDPMs a variation on Score Based Generative Modeling? Or is there a fundamental difference between the two?
-
Diffusion Models Beat GANs on Image Synthesis
This new approach to generative modelling looks very intriguing.
In a similar vein, there's this ICLR paper from this year using stochastic differential equations for generative modelling: https://arxiv.org/abs/2011.13456
- [D] Efficient, concurrent input pipelines in JAX?
What are some alternatives?
numpyro - Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU.
guided-diffusion
prml - Repository of notes, code and notebooks in Python for the book Pattern Recognition and Machine Learning by Christopher Bishop
pytorch-generative - Easy generative modeling in PyTorch.
jaxopt - Hardware accelerated, batchable and differentiable optimizers in JAX.
SDE - Example codes for the book Applied Stochastic Differential Equations
machine-learning-experiments - 🤖 Interactive Machine Learning experiments: 🏋️models training + 🎨models demo
Financial-Models-Numerical-Methods - Collection of notebooks about quantitative finance, with interactive python code.
lucid - A collection of infrastructure and tools for research in neural network interpretability.
Compositional-Visual-Generation-with-Composable-Diffusion-Models-PyTorch - [ECCV 2022] Compositional Generation using Diffusion Models
PRML - PRML algorithms implemented in Python
best-of-ml-python - 🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.