Oceananigans.jl vs Torch.jl
| | Oceananigans.jl | Torch.jl |
|---|---|---|
| Mentions | 4 | 6 |
| Stars | 875 | 204 |
| Growth | 1.6% | 2.5% |
| Activity | 9.5 | 4.2 |
| Latest commit | 5 days ago | about 17 hours ago |
| Language | Julia | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Oceananigans.jl
- Julia 1.10 Released
I think it's also the design philosophy. JuMP and ForwardDiff are great success stories, and both are packages very light on dependencies. I like those.
The DiffEq library seems to pull you toward the SciML ecosystem, and that might not be agreeable to everyone.
For instance, a well-known Julia project that simulates differential equations seems to have implemented its own solvers:
https://github.com/CliMA/Oceananigans.jl
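As a point of reference, here is a minimal sketch of the kind of dependency-light usage that makes ForwardDiff popular; the function `f` and the inputs are illustrative, not from the quoted post:

```julia
using ForwardDiff

# An ordinary Julia function; ForwardDiff differentiates it
# directly via dual numbers, with no special annotations.
f(x) = sin(x[1]) + x[1] * x[2]^2

x = [1.0, 2.0]
g = ForwardDiff.gradient(f, x)  # exact gradient, a 2-element vector
```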
- GPU vendor-agnostic fluid dynamics solver in Julia
I'm currently playing around with Oceananigans.jl (https://github.com/CliMA/Oceananigans.jl). Do you know how the two are similar or different?
Oceananigans.jl has really intuitive step-by-step examples and a great Discussions page on GitHub.
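For a taste of those examples, here is a minimal sketch in the style of the package's documented two-dimensional turbulence example; the grid size, time step, and stop time are illustrative, and the API may differ between versions:

```julia
using Oceananigans

# Build a small doubly periodic 2D grid.
grid = RectilinearGrid(size=(64, 64), x=(0, 2π), y=(0, 2π),
                       topology=(Periodic, Periodic, Flat))

# Incompressible model with a WENO advection scheme.
model = NonhydrostaticModel(; grid, advection=WENO())

# Random initial velocity field.
u, v, w = model.velocities
set!(model, u=rand(size(u)...), v=rand(size(v)...))

# Integrate forward in time.
simulation = Simulation(model, Δt=0.01, stop_time=1)
run!(simulation)
```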
- Supercharged high-resolution ocean simulation with JAX
Torch.jl
- Julia 1.10 Released
- Julia 1.9: A New Era of Performance and Flexibility
- How usable is Julia for Natural Language Processing Machine learning?
- Does Julia Have a Chance to Overthrow Python in the Machine Learning Industry?
For frontends, Python has quite a head start. In principle it would be possible to write Julia front-ends to existing ML libraries (written e.g. in C), for example https://github.com/FluxML/Torch.jl, but the advantages over Python frontends would be very limited. Only a front-to-back Julia implementation leverages most of the language's advantages, like composability and flexibility.
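To illustrate what "front-to-back Julia" buys you, here is a minimal sketch with Flux (the model shape and data are illustrative): a Flux model is ordinary Julia code, so differentiation and other packages compose with it directly rather than through a binding layer.

```julia
using Flux

# A plain Flux model is ordinary Julia code.
model = Chain(Dense(2 => 16, relu), Dense(16 => 1))

x = rand(Float32, 2, 8)       # batch of 8 random samples
loss(m, x) = sum(abs2, m(x))

# Gradients come from Julia-native AD (Zygote, re-exported by Flux),
# not from a foreign runtime behind a wrapper.
grads = Flux.gradient(m -> loss(m, x), model)
```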
- Julia: faster than Fortran, cleaner than Numpy
PyTorch, for example, is a C++ library with a Python user interface; see e.g. the language breakdown on GitHub (https://github.com/pytorch/pytorch). There is also a Julia binding for Torch (https://github.com/FluxML/Torch.jl), but I do not know how up-to-date it is.
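For reference, the usage pattern shown in Torch.jl's README looks roughly like the sketch below; treat the exact names as assumptions, since the binding may not track current Metalhead/Flux releases:

```julia
# Sketch after the Torch.jl README; API details may have drifted.
using Metalhead, Torch
using Torch: torch

resnet = ResNet()           # a standard Flux model from Metalhead
tresnet = resnet |> torch   # move the model onto libtorch-backed tensors
```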
What are some alternatives?
MATDaemon.jl
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
FiniteDiff.jl - Fast non-allocating calculations of gradients, Jacobians, and Hessians with sparsity support
PyTorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
MITgcm - M.I.T General Circulation Model master code and documentation repository
gluon-nlp - NLP made easy
Metal.jl - Metal programming in Julia
SciPyDiffEq.jl - Wrappers for the SciPy differential equation solvers for the SciML Scientific Machine Learning organization
opendylan - Open Dylan compiler and IDE
JuliaTorch - Using PyTorch in Julia Language
julia-ml-from-scratch - Machine learning from scratch in Julia
threads - Threads for Lua and LuaJIT. Transparent exchange of data between threads is allowed thanks to torch serialization.