neural-tangents vs eigenlearning

| | neural-tangents | eigenlearning |
|---|---|---|
| Mentions | 4 | 5 |
| Stars | 2,225 | 49 |
| Growth | 0.6% | - |
| Activity | 7.6 | 3.3 |
| Latest commit | 2 months ago | about 1 year ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
neural-tangents
-
Any Deep ReLU Network Is Shallow
The neural tangent kernel is used to capture the power of a fully-trained deep net of infinite width.
https://openreview.net/pdf?id=rkl4aESeUH, https://github.com/google/neural-tangents
> It has long been known that a single-layer fully-connected neural network with an i.i.d. prior over its parameters is equivalent to a Gaussian process (GP), in the limit of infinite network width.
https://arxiv.org/abs/1711.00165
And of course, one needs to look back at SVMs, which apply a kernel function and then separate with a hyperplane, a setup that looks a lot like an ANN with a single hidden layer followed by a linear mapping.
https://stats.stackexchange.com/questions/238635/kernel-meth...
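The infinite-width GP correspondence quoted above is exactly what neural-tangents computes in closed form. Below is a minimal sketch using the library's `stax` API; the one-hidden-layer architecture and the toy data are illustrative choices of mine, not taken from the linked papers.

```python
# Minimal sketch: the GP implied by an infinitely wide one-hidden-layer net.
from jax import random
from neural_tangents import stax

# One hidden layer, as in the single-layer result quoted above.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(), stax.Dense(1)
)

key1, key2 = random.split(random.PRNGKey(0))
x_train = random.normal(key1, (10, 3))  # 10 toy inputs of dimension 3
x_test = random.normal(key2, (4, 3))

# Closed-form covariance of the equivalent Gaussian process (the NNGP kernel).
k_train_train = kernel_fn(x_train, x_train, 'nngp')
k_test_train = kernel_fn(x_test, x_train, 'nngp')
print(k_train_train.shape, k_test_train.shape)  # (10, 10), (4, 10)
```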
-
[R] Training Machine Learning Models More Efficiently with Dataset Distillation
Code for https://arxiv.org/abs/2011.00050 found: https://github.com/google/neural-tangents
-
[D] Relationship Between Kernels, Neural Networks and Gaussian Process
I saw that you asked about neural tangent kernels (NTK) in another post yesterday -- be aware that what you're referencing in the present post is the "neural network Gaussian process" (NNGP), which is distinct from the NTK! The README of https://github.com/google/neural-tangents should help clear up the confusion. (I also took the term NNGP from there.)
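Since the library exposes both kernels through the same `kernel_fn`, a quick way to see that NNGP and NTK are distinct objects is to compute both for one architecture. A hedged sketch; the toy architecture and data are assumptions of mine:

```python
# Same network, two different infinite-width kernels.
import jax.numpy as jnp
from jax import random
from neural_tangents import stax

_, _, kernel_fn = stax.serial(stax.Dense(128), stax.Relu(), stax.Dense(1))

x = random.normal(random.PRNGKey(1), (8, 5))

nngp = kernel_fn(x, x, 'nngp')  # network as a Bayesian prior (random params)
ntk = kernel_fn(x, x, 'ntk')    # kernel governing infinite-width GD training
print(jnp.allclose(nngp, ntk))  # False: genuinely different kernels
```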
-
[D] neural tangent kernel
It's true! There have been dozens of papers published on this topic, some of which are listed here: https://github.com/google/neural-tangents#references
eigenlearning
-
Neural Architecture Search (NAS) [D]
In addition to using a validation set, there is research on the neural tangent kernel (NTK) claiming that metrics correlated with both training speed and generalization can be computed from the NTK. This would reduce or remove the need for training, since one can substitute computing the NTK for it. I don't have a complete list of references, but here is one example and another where they have applied the NTK (and the number of linear regions) to NAS.
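A hedged sketch of that idea: score a candidate architecture, without training it, by the conditioning of its empirical NTK at initialization, in the spirit of TE-NAS. The candidate architectures and the exact scoring rule here are illustrative, not the metric of any specific paper.

```python
# Training-free NAS proxy sketch: rank architectures by NTK conditioning.
import jax.numpy as jnp
from jax import random
import neural_tangents as nt
from neural_tangents import stax

def ntk_condition_number(layers, x, key):
    """Eigenvalue ratio of the empirical NTK of one candidate at init."""
    init_fn, apply_fn, _ = stax.serial(*layers)
    _, params = init_fn(key, x.shape)
    ntk_fn = nt.empirical_ntk_fn(apply_fn)  # finite-width NTK at init
    k = ntk_fn(x, None, params)             # (n, n) Gram matrix
    eig = jnp.linalg.eigvalsh(k)            # ascending eigenvalues
    return eig[-1] / eig[0]                 # large ratio ~ poorly conditioned

key = random.PRNGKey(0)
x = random.normal(key, (16, 8))

candidates = {
    'shallow': [stax.Dense(64), stax.Relu(), stax.Dense(1)],
    'deep': [stax.Dense(64), stax.Relu()] * 4 + [stax.Dense(1)],
}
for name, layers in candidates.items():
    print(name, float(ntk_condition_number(layers, x, key)))
```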
-
[R] Neural Tangent Kernel Eigenvalues Accurately Predict Generalization
Code for https://arxiv.org/abs/2110.03922 found: https://github.com/james-simon/eigenlearning
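For intuition on what the paper does with those eigenvalues, here is a hedged NumPy sketch of the eigenframework common to this line of work (Simon et al., and related results by Bordelon et al. and Canatar et al.): solve for a constant kappa from the kernel spectrum and the sample size, then assign each eigenmode a "learnability" lambda_i / (lambda_i + kappa). The power-law spectrum below is a toy assumption; consult the repo for the paper's exact estimator.

```python
# Eigenframework sketch: per-mode learnability from kernel eigenvalues.
import numpy as np

def kappa(eigvals, n, ridge=0.0, iters=100):
    """Solve n = ridge/kappa + sum_i eig_i / (eig_i + kappa) by bisection."""
    lo, hi = 1e-12, eigvals.sum() + ridge
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lhs = ridge / mid + np.sum(eigvals / (eigvals + mid))
        lo, hi = (mid, hi) if lhs > n else (lo, mid)
    return 0.5 * (lo + hi)

# Toy power-law kernel spectrum, illustrative only.
eigvals = 1.0 / np.arange(1, 1001) ** 2
n = 50  # training-set size
k = kappa(eigvals, n)
learnability = eigvals / (eigvals + k)  # fraction of each mode that is learned
print(k, learnability[:5], learnability.sum())  # sum ~ n in the ridgeless case
```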
What are some alternatives?
pymc-resources - PyMC educational resources
first-order-model - This repository contains the source code for the paper First Order Motion Model for Image Animation
mango - Parallel Hyperparameter Tuning in Python
TensorFlow-Examples - TensorFlow Tutorial and Examples for Beginners (support TF v1 & v2)
indaba-pracs-2022 - Notebooks for the Practicals at the Deep Learning Indaba 2022.
shap - A game theoretic approach to explain the output of any machine learning model.
Bayesian-Optimization-in-FSharp - Bayesian Optimization via Gaussian Processes in F#
fastai - The fastai deep learning library
hyper-nn - Easy Hypernetworks in Pytorch and Jax
timm-vis - Visualizer for PyTorch image models
google-research - Google Research