indaba-pracs-2022 vs neural-tangents

| | indaba-pracs-2022 | neural-tangents |
|---|---|---|
| Mentions | 1 | 4 |
| Stars | 172 | 2,221 |
| Growth | 0.6% | 0.5% |
| Activity | 0.0 | 7.6 |
| Latest commit | 30 days ago | 2 months ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
indaba-pracs-2022
-
From Deep Learning Foundations to Stable Diffusion
This year's Deep Learning Indaba had a tutorial on diffusion models in JAX: https://github.com/deep-learning-indaba/indaba-pracs-2022/tr...
neural-tangents
-
Any Deep ReLU Network Is Shallow
The neural tangent kernel is used to capture the power of a fully-trained deep net of infinite width.
https://openreview.net/pdf?id=rkl4aESeUH, https://github.com/google/neural-tangents
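For a concrete sense of what "fully-trained at infinite width" means, here is a minimal sketch following the neural-tangents README: `gradient_descent_mse_ensemble` gives closed-form predictions of the infinite-width network trained to convergence on MSE loss. The architecture and toy data below are invented for illustration, not taken from the paper.

```python
import neural_tangents as nt
from jax import random
from neural_tangents import stax

# Infinite-width architecture; kernel_fn is its analytic kernel.
_, _, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

# Toy data, invented for the example.
key1, key2, key3 = random.split(random.PRNGKey(0), 3)
x_train = random.normal(key1, (20, 10))
y_train = random.normal(key2, (20, 1))
x_test = random.normal(key3, (5, 10))

# Closed-form outputs of the infinitely wide network after gradient
# descent on MSE has run to convergence (the default t=None is t -> inf).
predict_fn = nt.predict.gradient_descent_mse_ensemble(kernel_fn, x_train, y_train)
y_test_ntk = predict_fn(x_test=x_test, get='ntk')
```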
> It has long been known that a single-layer fully-connected neural network with an i.i.d. prior over its parameters is equivalent to a Gaussian process (GP), in the limit of infinite network width.
https://arxiv.org/abs/1711.00165
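That correspondence is exactly what neural-tangents computes. As a sketch (the architecture and inputs here are arbitrary), the closed-form GP covariance of a single-hidden-layer ReLU network:

```python
from jax import random
from neural_tangents import stax

# A single hidden layer with a linear readout. In the infinite-width
# limit, with i.i.d. Gaussian parameters, this network is exactly a GP,
# and kernel_fn evaluates the GP covariance analytically. (The width
# 1024 only affects the finite-width apply_fn, not kernel_fn.)
_, _, kernel_fn = stax.serial(stax.Dense(1024), stax.Relu(), stax.Dense(1))

x = random.normal(random.PRNGKey(0), (8, 3))
k = kernel_fn(x, x, 'nngp')  # 8x8 covariance matrix of the GP
```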
And of course, it is worth looking back at SVMs, which apply a kernel function and then separate the data with a hyperplane, much like an ANN with a single hidden layer followed by a linear mapping.
https://stats.stackexchange.com/questions/238635/kernel-meth...
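To make the analogy concrete, here is a small sketch (my own illustration, not from the thread): a single hidden ReLU layer with fixed random weights followed by a linear readout is a kernel machine, and as the width grows its feature inner products converge to the order-1 arc-cosine kernel of Cho & Saul (2009).

```python
import jax
import jax.numpy as jnp

d, width = 5, 100_000
W = jax.random.normal(jax.random.PRNGKey(0), (d, width))

def features(x):
    # Fixed random hidden layer with ReLU, scaled so that inner
    # products are Monte Carlo averages over the random weights.
    return jnp.maximum(x @ W, 0.0) / jnp.sqrt(width)

def arccos_kernel(x, y):
    # Closed-form E[relu(w.x) * relu(w.y)] for w ~ N(0, I):
    # (|x||y| / 2pi) * (sin(theta) + (pi - theta) * cos(theta)).
    nx, ny = jnp.linalg.norm(x), jnp.linalg.norm(y)
    cos = jnp.clip(x @ y / (nx * ny), -1.0, 1.0)
    theta = jnp.arccos(cos)
    return nx * ny / (2 * jnp.pi) * (jnp.sin(theta) + (jnp.pi - theta) * cos)

x = jax.random.normal(jax.random.PRNGKey(1), (d,))
y = jax.random.normal(jax.random.PRNGKey(2), (d,))
print(features(x) @ features(y))  # Monte Carlo estimate ...
print(arccos_kernel(x, y))        # ... should be close to the closed form
```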
-
[R] Training Machine Learning Models More Efficiently with Dataset Distillation
Code for https://arxiv.org/abs/2011.00050 can be found at https://github.com/google/neural-tangents
-
[D] Relationship Between Kernels, Neural Networks and Gaussian Process
I saw that you asked about neural tangent kernels (NTK) in another post yesterday -- be aware that what you're referencing in the present post are "neural network gaussian processes" (NNGP), which is distinct from NTK! The README of https://github.com/google/neural-tangents should help clear up the confusion. (I also took the term NNGP from there.)
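Code makes the distinction tangible: the same architecture yields two different kernels, the NNGP kernel (the GP that an infinitely wide untrained/Bayesian network defines) and the NTK (the kernel governing gradient-descent training dynamics). A sketch with an arbitrary architecture:

```python
from jax import random
from neural_tangents import stax

_, _, kernel_fn = stax.serial(stax.Dense(512), stax.Relu(), stax.Dense(1))

x1 = random.normal(random.PRNGKey(1), (4, 7))
x2 = random.normal(random.PRNGKey(2), (6, 7))

k_nngp = kernel_fn(x1, x2, 'nngp')  # NNGP: Bayesian / untrained-network GP
k_ntk = kernel_fn(x1, x2, 'ntk')    # NTK: gradient-descent training kernel
# Generally k_nngp != k_ntk, which is why conflating the two causes confusion.
```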
-
[D] neural tangent kernel
It's true! There have been dozens of papers published on this topic, some of which are listed here: https://github.com/google/neural-tangents#references
What are some alternatives?
PyCBC-Tutorials - Learn how to use PyCBC to analyze gravitational-wave data and do parameter inference.
pymc-resources - PyMC educational resources
jaxrl - JAX (Flax) implementation of algorithms for Deep Reinforcement Learning with continuous action spaces.
eigenlearning - codebase for "A Theory of the Inductive Bias and Generalization of Kernel Regression and Wide Neural Networks"
mango - Parallel Hyperparameter Tuning in Python
bodywork-pymc3-project - Serving Uncertainty with Bayesian inference, using PyMC3 with Bodywork
Bayesian-Optimization-in-FSharp - Bayesian Optimization via Gaussian Processes in F#
brax - Massively parallel rigidbody physics simulation on accelerator hardware.
hyper-nn - Easy Hypernetworks in Pytorch and Jax