| | interactive-gp-visualization | neural-tangents |
|---|---|---|
| Mentions | 1 | 4 |
| Stars | 158 | 2,225 |
| Growth | - | 0.6% |
| Activity | 3.0 | 7.6 |
| Latest commit | about 1 year ago | 2 months ago |
| Language | Svelte | Jupyter Notebook |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
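The exact formula behind the activity score isn't published here; as a purely hypothetical sketch of a recency-weighted commit count (illustrative only, not the site's actual metric), one could imagine something like:

```python
from datetime import datetime, timezone

def activity_score(commit_dates, half_life_days=30.0):
    """Hypothetical recency-weighted commit count (not the real metric):
    each commit contributes an exponentially decaying weight based on its age."""
    now = datetime.now(timezone.utc)
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)
    return score
```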
interactive-gp-visualization
What does the wiggly animation in this visualization of samples from GPs represent?
The source code for the visualization is available, but I'm not at all familiar with Svelte or D3.js, so I figured it would be much easier to just ask around first instead.
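I can't say what that repo actually does under the hood, but a common way to animate GP samples is to blend two independent draws by rotating the underlying Gaussian noise vector, so that every frame is still an exact sample from the GP. A minimal Python sketch of that trick (an assumption about the technique, not necessarily what this visualization implements):

```python
import numpy as np

# Inputs and an illustrative RBF kernel.
x = np.linspace(0, 10, 200)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(x)))  # jitter for numerical stability

# Two independent standard-normal vectors define the animation path.
rng = np.random.default_rng(0)
z1 = rng.standard_normal(len(x))
z2 = rng.standard_normal(len(x))

def frame(t):
    """GP sample at animation time t: since cos(t)**2 + sin(t)**2 == 1,
    the blended noise vector stays standard normal, so every frame is a valid draw."""
    z = np.cos(t) * z1 + np.sin(t) * z2
    return L @ z
```

Sweeping t continuously produces a smooth "wiggle"; at t = pi/2 the curve has morphed into the second independent sample.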
neural-tangents
Any Deep ReLU Network Is Shallow
The neural tangent kernel (NTK) is used to capture the power of a fully-trained deep net of infinite width.
https://openreview.net/pdf?id=rkl4aESeUH, https://github.com/google/neural-tangents
> It has long been known that a single-layer fully-connected neural network with an i.i.d. prior over its parameters is equivalent to a Gaussian process (GP), in the limit of infinite network width.
https://arxiv.org/abs/1711.00165
And of course, one should look back at SVMs: applying a kernel function and then separating with a line looks a lot like an ANN with a single hidden layer followed by a linear mapping.
https://stats.stackexchange.com/questions/238635/kernel-meth...
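To make the infinite-width correspondence in that quote concrete, here is a rough sketch of how neural-tangents exposes both the NNGP kernel (the GP of the untrained, Bayesian infinite network) and the NTK (the kernel governing the fully-trained infinite network), based on the library's README; exact signatures may vary across versions:

```python
from jax import random
from neural_tangents import stax

# Infinite-width fully-connected ReLU network.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1)
)

k1, k2 = random.split(random.PRNGKey(0))
x1 = random.normal(k1, (10, 5))   # 10 inputs of dimension 5
x2 = random.normal(k2, (20, 5))

# 'nngp': GP kernel of the untrained infinite net; 'ntk': kernel of its trained limit.
kernels = kernel_fn(x1, x2, ('nngp', 'ntk'))
print(kernels.nngp.shape, kernels.ntk.shape)  # both (10, 20)
```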
[R] Training Machine Learning Models More Efficiently with Dataset Distillation
Code for https://arxiv.org/abs/2011.00050 found: https://github.com/google/neural-tangents
[D] Relationship Between Kernels, Neural Networks and Gaussian Process
I saw that you asked about neural tangent kernels (NTK) in another post yesterday -- be aware that what you're referencing in the present post is the "neural network Gaussian process" (NNGP), which is distinct from the NTK! The README of https://github.com/google/neural-tangents should help clear up the confusion. (I also took the term NNGP from there.)
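To make the NNGP-versus-NTK distinction concrete, here is a rough sketch following the neural-tangents README (details hedged): the same infinite-width architecture gives two different GP predictors, one for the Bayesian/untrained network (NNGP) and one for the network after full gradient-descent training (NTK).

```python
import neural_tangents as nt
from neural_tangents import stax
from jax import random

init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(256), stax.Erf(), stax.Dense(1)
)

k1, k2, k3 = random.split(random.PRNGKey(1), 3)
x_train = random.normal(k1, (20, 3))
y_train = random.normal(k2, (20, 1))
x_test = random.normal(k3, (5, 3))

# Closed-form infinite-width inference; diag_reg adds a small ridge term.
predict_fn = nt.predict.gradient_descent_mse_ensemble(
    kernel_fn, x_train, y_train, diag_reg=1e-4
)

# NNGP posterior mean (Bayesian infinite net) vs. NTK mean (trained infinite net).
y_nngp = predict_fn(x_test=x_test, get='nngp')
y_ntk = predict_fn(x_test=x_test, get='ntk')
```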
[D] neural tangent kernel
It's true! There have been dozens of papers published on this topic, some of which are listed here: https://github.com/google/neural-tangents#references
What are some alternatives?
pymc-resources - PyMC educational resources
eigenlearning - codebase for "A Theory of the Inductive Bias and Generalization of Kernel Regression and Wide Neural Networks"
mango - Parallel Hyperparameter Tuning in Python
indaba-pracs-2022 - Notebooks for the Practicals at the Deep Learning Indaba 2022.
Bayesian-Optimization-in-FSharp - Bayesian Optimization via Gaussian Processes in F#
hyper-nn - Easy Hypernetworks in Pytorch and Jax
timm-vis - Visualizer for PyTorch image models
google-research - Google Research