StaticLint.jl vs diffrax

| | StaticLint.jl | diffrax |
|---|---|---|
| Mentions | 4 | 21 |
| Stars | 133 | 1,237 |
| Stars growth (month over month) | 1.5% | - |
| Activity | 5.7 | 8.2 |
| Last commit | about 1 month ago | 6 days ago |
| Language | Julia | Python |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits are weighted more heavily than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects being tracked.
StaticLint.jl
-
Julia v1.9.0 has been released
Yes, tooling around this is being developed, both in the form of linters (e.g. https://github.com/julia-vscode/StaticLint.jl) and through real compiler-integration tools like the very cool https://aviatesk.github.io/JET.jl/dev/, but this is definitely an area where Julia's tooling is weaker than other languages'. It seems to be picking up a lot of speed, though.
-
The Julia language has a number of correctness flaws
It is correct if `A` is of type `Array`, since the standard `Array` in Julia uses 1-based indexing. It is incorrect if `A` is some other subtype of `AbstractArray`, since those may not use 1-based indexing; for example, an OffsetArrays.jl array with axes `0:9` has length 10, so indexing it over `1:length(A)` reads one position past its last valid index. Normally that case raises an error thanks to bounds checking, but the OP is talking about the case where bounds checking is turned off with `@inbounds` for speed, which silently gives wrong answers instead of an error.
An issue was created some time ago in StaticLint.jl to fix this: https://github.com/julia-vscode/StaticLint.jl/issues/337
-
I created an Emacs package to statically lint Julia files (using StaticLint.jl)
Statically lint = find errors in a Julia file without running it, such as uses of undefined variables or calls to functions with the wrong arguments. For Julia, StaticLint.jl is an actively developed library that does static linting: it provides a set of functions that report errors like the ones mentioned above. If you are an Emacs user, this project is a convenience layer that runs Julia silently in the background and communicates with it to extract the errors in the file you currently have open. Those errors are then highlighted in your editor via Flycheck, one of the standard ways of highlighting errors in Emacs.
diffrax
-
Ask HN: What side projects landed you a job?
-
[P] Optimistix, nonlinear optimisation in JAX+Equinox!
Optimistix has high-level APIs for minimisation, least-squares, root-finding, and fixed-point iteration and was written to take care of these kinds of subroutines in Diffrax.
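For a flavour of those APIs, here is a minimal root-finding sketch along the lines of the example in the Optimistix README (the toy system of equations and the tolerances are illustrative choices, not anything canonical):

```python
import jax.numpy as jnp
import optimistix as optx

# A toy pair of coupled equations to solve for (a, b); purely illustrative.
def fn(y, args):
    a, b = y
    c = jnp.tanh(jnp.sum(b)) - a
    d = a**2 - jnp.sinh(b + 1)
    return c, d

solver = optx.Newton(rtol=1e-8, atol=1e-8)
y0 = (jnp.array(0.0), jnp.zeros((2, 2)))
sol = optx.root_find(fn, solver, y0)  # sol.value holds the root
```

The other entry points (`optx.minimise`, `optx.least_squares`, `optx.fixed_point`) follow the same problem-function / solver / initial-value pattern.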
-
Show HN: Optimistix: Nonlinear Optimisation in Jax+Equinox
Diffrax (https://github.com/patrick-kidger/diffrax).
Here is the GitHub: https://github.com/patrick-kidger/optimistix
The elevator pitch is that Optimistix is really fast, especially to compile. It ...
-
Scientific computing in JAX
Sure. So I've got some PyTorch benchmarks here. The main take-away so far has been that for a neural ODE, the backward pass takes about 50% longer in PyTorch, and the forward (inference) pass takes an incredible 100x longer.
-
[D] JAX vs PyTorch in 2023
FWIW this worked for me. :D My full-time job is now writing JAX libraries at Google. Equinox for neural networks, Diffrax for differential equation solvers, etc.
-
Returning to snake's nest after a long journey, any major advances in python for science ?
It's relatively early days yet, but JAX is in the process of developing its nascent scientific computing / scientific machine learning ecosystem, mostly because of its strong autodifferentiation capabilities, excellent JIT compiler, etc. (E.g., to show off one of my own projects, Diffrax is the library of diffeq solvers for JAX.)
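For a sense of what Diffrax looks like in use, here is a minimal sketch in the style of the README example (the vector field, solver choice, and step size are placeholder choices):

```python
import jax.numpy as jnp
import diffrax

# dy/dt = -y, a simple linear ODE with known solution y(t) = y0 * exp(-t).
def vector_field(t, y, args):
    return -y

term = diffrax.ODETerm(vector_field)
solver = diffrax.Tsit5()  # an explicit 5th-order Runge-Kutta solver
sol = diffrax.diffeqsolve(term, solver, t0=0.0, t1=3.0, dt0=0.1, y0=jnp.array(1.0))
print(sol.ts, sol.ys)  # save times and the solution values at them
```

Because the whole solve is a JAX computation, it can be wrapped in `jax.jit` or differentiated through with `jax.grad`, which is the point of the ecosystem being described here.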
-
What's the best thing/library you learned this year ?
Diffrax - solving ODEs with JAX and computing their derivatives automatically.
functools - love partial and lru_cache.
fastprogress - a simpler progress bar than tqdm.
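For anyone who hasn't met the two functools tools named here, a quick self-contained sketch of what they do:

```python
from functools import lru_cache, partial

# partial: pre-fill some arguments of a function.
def scale(factor, x):
    return factor * x

double = partial(scale, 2)
assert double(21) == 42

# lru_cache: memoise a pure function, so repeated calls hit a cache.
@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

assert fib(50) == 12586269025  # fast, despite the naive recursion
```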
-
PyTorch 2.0
At least prior to this announcement: JAX was much faster than PyTorch for differentiable physics. (Better JIT compiler; reduced Python-level overhead.)
E.g for numerical ODE simulation, I've found that Diffrax (https://github.com/patrick-kidger/diffrax) is ~100 times faster than torchdiffeq on the forward pass. The backward pass is much closer, and for this Diffrax is about 1.5 times faster.
It remains to be seen how PyTorch 2.0 will compare, of course!
Right now my job is actually building out the scientific computing ecosystem in JAX, so feel free to ping me with any other questions.
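To illustrate where the reduced Python-level overhead mentioned above comes from: `jax.jit` traces a Python function once and compiles it with XLA, so the Python interpreter is no longer in the hot loop. A minimal sketch (the step function is a made-up stand-in for "differentiable physics" code, not anything from Diffrax):

```python
import jax
import jax.numpy as jnp

# A toy explicit-Euler step for dy/dt = -y.
def euler_step(y, dt):
    return y + dt * (-y)

# Traced and compiled once; later calls run the compiled XLA code directly.
fast_step = jax.jit(euler_step)

y = jnp.ones(1000)
for _ in range(100):
    y = fast_step(y, 0.01)
```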
-
Python 3.11 is much faster than 3.8
https://github.com/patrick-kidger/equinox
https://github.com/patrick-kidger/diffrax
Which are neural network and differential equation libraries for JAX.
[Obligatory I-am-a-googler-my-opinions-do-not-represent-my-employer...]
-
Ask HN: What's your favorite programmer niche?
Autodifferentiable programming!
Neural networks are the famous example of this, of course -- but this can be extended to all of scientific computing. ODE/SDE solvers, root-finding algorithms, LQP, molecular dynamics, ...
These days I'm doing all my work in JAX. (E.g. see Equinox or Diffrax: https://github.com/patrick-kidger/equinox, https://github.com/patrick-kidger/diffrax). A lot of modern work is now based around hybridising such techniques with neural networks.
I'd really encourage anyone interested to learn how JAX works under the hood as well (look up "autodidax"). Lots of clever/novel ideas in its design.
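A tiny sketch of what autodifferentiable programming means in practice: any numerical routine written in JAX can be differentiated, not just a neural network. The energy function below is an arbitrary illustrative choice:

```python
import jax
import jax.numpy as jnp

# An arbitrary scientific-computing-style function: the potential
# energy of particles in a quartic double-well.
def energy(x):
    return jnp.sum(x**4 - x**2)

# Exact gradients via autodiff: no finite differences, nothing by hand.
grad_energy = jax.grad(energy)

x = jnp.array([0.5, -1.0, 2.0])
print(grad_energy(x))  # elementwise 4x^3 - 2x
```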
What are some alternatives?
LanguageServer.jl - An implementation of the Microsoft Language Server Protocol for the Julia language.
deepxde - A library for scientific machine learning and physics-informed learning
julia-staticlint - Emacs integration for StaticLint.jl
tiny-cuda-nn - Lightning fast C++/CUDA neural network framework
Optimization.jl - Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
flax - Flax is a neural network library for JAX that is designed for flexibility.
StatsBase.jl - Basic statistics for Julia
juliaup - Julia installer and version multiplexer
dotfiles - Linux work environment setup
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
Distributions.jl - A Julia package for probability distributions and associated functions.
dm-haiku - JAX-based neural network library