-
From the KAN repo itself, it appears they already have GPU support https://github.com/KindXiaoming/pykan/blob/master/tutorials/...
-
The tensor diagrams are not quite standard (yet). That's why I also include more "classical" neural network diagrams next to them.
I've recently been working on a library for doing automatic manipulation and differentiation of tensor diagrams (https://github.com/thomasahle/tensorgrad), and to me they are clearly a cleaner notation.
For a beautiful introduction to tensor networks, see also Jordan Taylor's blog post (https://www.lesswrong.com/posts/BQKKQiBmc63fwjDrj/graphical-...)
-
This is sort of my view as well: most of the hype, and most of the criticism, of KANs seems fairly unfounded.
I do think they have a lot of potential, but what has been published so far does not represent a panacea. Perhaps they will have an impact like transformers, perhaps they will only serve in a little niche. You can't really tell immediately how refinements will alter the usability.
Finding out what those refinements are and how they change things is what research is all about. I have been quite enjoying following the progress at https://github.com/mintisan/awesome-kan and seeing the variety of things being tried. I have a few ideas of my own I might try sometime.
Between KANs and fixed activation function networks there is an entire continuum of activation function tuning available for research.
Buckets of simple single-parameter activation functions, something like x·sigmoid(mx) (approaches ReLU as m grows large, approximates GELU at m ≈ 1.702, and is exactly SiLU at m = 1). This adds a small number of parameters for presumably some gain.
Single activation functions as above per neuron.
Multi-parameter activation functions, in batches, per neuron.
Many parameter function approximators, in batches, per neuron.
Full KANs without weights.
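The first rung of that ladder is easy to check numerically. A minimal NumPy sketch (the function name `act` and the tolerances are my own choices, not from any KAN library) showing that x·sigmoid(mx) recovers SiLU exactly at m = 1, approximates GELU near m ≈ 1.702, and tends toward ReLU for large m:

```python
import numpy as np
from math import erf, sqrt

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def act(x, m):
    # One-parameter activation family: x * sigmoid(m * x).
    # m = 1 gives SiLU/swish, m ~ 1.702 approximates GELU,
    # and large m approaches ReLU.
    return x * sigmoid(m * x)

x = np.linspace(-4.0, 4.0, 81)

silu = x * sigmoid(x)
gelu = np.array([0.5 * v * (1.0 + erf(v / sqrt(2.0))) for v in x])
relu = np.maximum(x, 0.0)

assert np.allclose(act(x, 1.0), silu)               # exact: SiLU by definition
assert np.max(np.abs(act(x, 1.702) - gelu)) < 0.05  # close GELU approximation
assert np.max(np.abs(act(x, 50.0) - relu)) < 1e-3   # nearly ReLU
```

So a single learnable m per neuron (or per layer) already spans a useful slice of the activation-function continuum before you reach full per-edge splines.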
I can see significant acclaim awaiting whoever derives a unified formula for where additional parameters should go for the largest impact.
-
Really detailed work, thank you. For those looking to jump straight to code, here is the link to the codebase discussed in the blog post:
https://github.com/Ameobea/kan
-
Milewski's category theory work was a godsend; his PDF in particular is great! https://bartoszmilewski.com/2014/10/28/category-theory-for-p...
Maybe you could add a little and offer resources from your background?