| | diff-zoo | Zygote.jl |
|---|---|---|
| Mentions | 4 | 9 |
| Stars | 772 | 1,442 |
| Growth | - | 0.4% |
| Activity | 0.0 | 8.1 |
| Latest Commit | almost 3 years ago | about 1 month ago |
| Language | Julia | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
diff-zoo
-
How do you generally go about finding the Jacobian of a complicated observation function for a Kalman filter?
To add to the software suggestions in #1, look for packages that do forward-mode AD. For Jacobians (many outputs), forward-mode AD is often faster than back-propagation (reverse mode). I use ForwardDiff.jl, which usually accepts any function the user might pass in as the observation function: https://github.com/baggepinnen/LowLevelParticleFilters.jl/blob/master/src/ekf.jl#L45 Here's a nice intro to forward- and reverse-mode AD: https://github.com/MikeInnes/diff-zoo/blob/notebooks/backandforth.ipynb It's the second notebook in a series; you might want to read the first as well if you find this topic interesting.
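A minimal sketch of the idea (the observation function and state layout here are made up for illustration, not taken from LowLevelParticleFilters.jl):

```julia
using ForwardDiff

# Hypothetical EKF observation function: range and bearing to the origin,
# given a state x = [px, py, vx, vy].
h(x) = [sqrt(x[1]^2 + x[2]^2), atan(x[2], x[1])]

x = [10.0, 5.0, 1.0, 0.0]        # current state estimate
H = ForwardDiff.jacobian(h, x)   # 2×4 Jacobian of h, evaluated at x
```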
-
Ask HN: What are some examples of elegant software?
This is an obscure one, but Mike Innes' "[automatic] differentiation for hackers" tutorial. It's a code tutorial, not software, if that counts. Both the way it's constructed and the functionality of Julia that gets shown off here are elegant.
https://github.com/MikeInnes/diff-zoo
-
Neural networks with automatic differentiation.
-
[D] Gradient Tape Implementation
The following repo shows how to build a simple reverse-mode implementation from scratch: https://github.com/MikeInnes/diff-zoo
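For a flavor of what such a from-scratch implementation looks like, here is a toy tape-based reverse-mode sketch (my own illustration, not code from diff-zoo):

```julia
mutable struct Tracked
    value::Float64
    grad::Float64
    backprop::Function            # pushes this node's gradient back to its parents
end

Tracked(v) = Tracked(v, 0.0, () -> nothing)

const TAPE = Tracked[]            # records operations in execution order

function track(value, make_backprop)
    t = Tracked(value)
    t.backprop = make_backprop(t)
    push!(TAPE, t)
    return t
end

Base.:+(a::Tracked, b::Tracked) = track(a.value + b.value,
    t -> () -> (a.grad += t.grad; b.grad += t.grad))
Base.:*(a::Tracked, b::Tracked) = track(a.value * b.value,
    t -> () -> (a.grad += b.value * t.grad; b.grad += a.value * t.grad))

function backward!(y::Tracked)
    y.grad = 1.0
    for t in reverse(TAPE)        # replay the tape in reverse
        t.backprop()
    end
end

x, w = Tracked(3.0), Tracked(2.0)
y = x * w + w                     # y = xw + w
backward!(y)
x.grad, w.grad                    # (2.0, 4.0): dy/dx = w, dy/dw = x + 1
```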
Zygote.jl
-
Yann Lecun: ML would have advanced if other lang had been adopted versus Python
If you look at Julia open-source projects you'll see that they tend to have a lot more contributors than their Python counterparts, even over shorter time spans. A package for defining statistical distributions has had 202 contributors (https://github.com/JuliaStats/Distributions.jl), etc. Julia Base has even had over 1,300 contributors (https://github.com/JuliaLang/julia), which is quite a lot for a core language, and that's mostly because the majority of the core is written in Julia itself.
This is one of the things that was noted quite a bit at this SIAM CSE conference: Julia development tends to have a lot more code reuse than other ecosystems like Python. For example, the various machine learning libraries like Flux.jl and Lux.jl share a lot of layer intrinsics in NNlib.jl (https://github.com/FluxML/NNlib.jl), the same GPU libraries (https://github.com/JuliaGPU/CUDA.jl), the same automatic differentiation library (https://github.com/FluxML/Zygote.jl), and of course the same JIT compiler (Julia itself). These two libraries are far enough apart that people say "Flux is to PyTorch as Lux is to JAX/flax", but while in the Python world those share almost no code or implementation, in the Julia world they share >90% of the core internals while having different higher-level APIs.
If one hasn't participated in this space, it's a bit hard to fathom how much code reuse goes on and how much that is shaped by the design of multiple dispatch. This is one of the reasons there is so much cohesion in the community: it doesn't matter if one person is an ecologist and the other is a financial engineer, both may be contributing to the same library, like Distances.jl, each adding a distance function which is then used in thousands of places. With the Python ecosystem you tend to have a lot more "megapackages" (PyTorch, SciPy, etc.) where the barrier to entry is generally a lot higher (and sometimes requires wrestling with the build systems, fun times). But in the Julia ecosystem a lot of core development happens in "small" but central libraries, like Distances.jl or Distributions.jl, which are simple enough for an undergrad to get productive in within a week but are then used everywhere (Distributions.jl, for example, is used in every statistics package, for defining prior distributions in Turing.jl's probabilistic programming language, etc.).
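As a purely illustrative sketch of that dispatch-driven reuse (made-up types and functions, not Distances.jl's actual API): a generic routine is written once, and each downstream user just adds a method for their own type.

```julia
# "Library" code, written once, knowing nothing about the concrete types below.
dist(a, b) = abs(a - b)                     # generic fallback

function nearest(query, candidates)
    best = first(candidates)
    for c in candidates
        dist(query, c) < dist(query, best) && (best = c)
    end
    return best                             # reused unchanged by every caller
end

# An "ecology" user adds a method for their own type...
struct Site
    lat::Float64
    lon::Float64
end
dist(a::Site, b::Site) = hypot(a.lat - b.lat, a.lon - b.lon)

# ...and a "finance" user adds another; both get `nearest` for free.
struct PriceQuote
    price::Float64
end
dist(a::PriceQuote, b::PriceQuote) = abs(a.price - b.price)

nearest(Site(0.0, 0.0), [Site(1.0, 1.0), Site(0.1, 0.2)])
nearest(PriceQuote(100.0), [PriceQuote(99.5), PriceQuote(103.0)])
```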
-
How long till Julia could be the default language to learn ML?
I think Julia has a lot going for it. I feel like autograd is one of the bigger ones, given that it's basically a language feature (https://github.com/FluxML/Zygote.jl for reference). I think the ecosystem is a bit of an uphill battle, though.
-
Neural networks with automatic differentiation.
Also check out https://github.com/FluxML/Zygote.jl which is the AD engine
-
PyTorch 1.8 release with AMD ROCm support
> There's sadly no performant autodiff system for general purpose Python.
Like there is for general purpose Julia? (https://github.com/FluxML/Zygote.jl)
-
The KimKlone Microcomputer
Thanks again. Like you said it is fun to dream (ask the "Scheme Machine" guys sometime about how they would go about it now), but practically with technology like Julia's Zygote:
https://github.com/FluxML/Zygote.jl
the efficiency of autodiff might be similar to that of an opcode anyway.
So, how did DEC do on the Alpha processor? I always heard good things about it--IIRC it was based on the VAX, but 64 bit. I learned PDP-11 assembler at RPI, during their college program for high school students in about 1984. We hand assembled code and really got to know the architecture.
-
FluxML/Zygote.jl -- v0.6.3 should implement a `jacobian` function but doesn't?
-
Did the makers of Zygote.jl use category theory to define their approach to computable autodiff?
and make that computable. It seems like lines 88 to 90 of this file in Zygote do that: https://github.com/FluxML/Zygote.jl/blob/master/src/compiler/chainrules.jl
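For context, the "computable pullback" shows up very directly in the ChainRules interface that Zygote consumes: an rrule returns the primal result together with a pullback closure. A toy rule for a made-up function (not code from chainrules.jl):

```julia
using ChainRulesCore

myscale(x) = 3x

function ChainRulesCore.rrule(::typeof(myscale), x)
    y = myscale(x)
    # The pullback maps a cotangent of the output to cotangents of the inputs
    # (the first slot is the tangent for the function itself).
    myscale_pullback(ȳ) = (NoTangent(), 3 * ȳ)
    return y, myscale_pullback
end

# With this rule defined, e.g. Zygote.pullback(myscale, 2.0) would return
# (6.0, back) with back(1.0) == (3.0,).
```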
-
Study group: Structure and Interpretation of Classical Mechanics in Clojure
-
Ask HN: Show me your Half Baked project
It's super powerful.
For example, Zygote.jl (https://github.com/FluxML/Zygote.jl) implements reverse-mode automatic differentiation by defining a generated function that is a transformation of the function being differentiated.
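A minimal usage sketch of what that looks like from the caller's side (standard Zygote API, toy function):

```julia
using Zygote

f(x) = x^3 + 2x

Zygote.gradient(f, 2.0)        # (14.0,) since f'(x) = 3x^2 + 2

# The same generated transformation also exposes the pullback directly:
y, back = Zygote.pullback(f, 2.0)
back(1.0)                      # (14.0,)
```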
What are some alternatives?
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
Enzyme - High-performance automatic differentiation of LLVM and MLIR.
ganja.js - Javascript Geometric Algebra Generator for Javascript, c++, c#, rust, python. (with operator overloading and algebraic literals)
ForwardDiff.jl - Forward Mode Automatic Differentiation for Julia
src - Read-only git conversion of OpenBSD's official CVS src repository. Pull requests not accepted - send diffs to the tech@ mailing list.
Tullio.jl - ⅀
nexus - A Nim web framework with batteries included
TensorFlow.jl - A Julia wrapper for TensorFlow
InvertibleNetworks.jl - A Julia framework for invertible neural networks
Yao.jl - Extensible, Efficient Quantum Algorithm Design for Humans.
syntaxdot - Neural syntax annotator, supporting sequence labeling, lemmatization, and dependency parsing.