> abstract blog posts
If you're referring to the blog post that made the HN front page yesterday, it is very much backed by actual experience (https://nyxt.atlas.engineer/) and quite a lot of code (https://github.com/atlas-engineer/nyxt/tree/master/source).
Their GitHub page has it: https://github.com/weymouth/WaterLily.jl
The release was just cut 9 hours ago, as shown in the releases section of the GitHub page (https://github.com/JuliaLang/julia/releases/tag/v1.9.0). Cutting the release kicks off the jobs that build and deploy the final binaries; once those finish, the julialang.org website is updated to mark it as the current release, and then the blog post for the new release goes out. You can even follow the last step of the process here (https://github.com/JuliaLang/www.julialang.org/pull/1875), since it all happens in the open-source organization.
You may be confusing front end APIs and the compiler backends.
Julia is flexible enough that you can essentially define domain-specific languages within Julia for certain applications. In this case, we are using Julia as an abstract front end and deferring the concrete implementation to vendor-specific GPU compilation drivers. Part of what permits this is that Julia is an LLVM front end, and many of the vendor drivers include LLVM-based backends. With some transformation of the Julia abstract syntax tree and the LLVM IR, we can connect the two.
That said, we are mostly dependent on vendors providing the backend compiler technology. When they do, we can bridge Julia to that interface. We can wrap Vulkan and technologies like oneAPI:
https://github.com/JuliaGPU/Vulkan.jl
https://github.com/JuliaGPU/oneAPI.jl
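As a small illustration of the LLVM-front-end point above (the `square` function here is just a throwaway example, not from any of the linked packages): the LLVM IR that Julia generates for any function can be inspected directly, and GPU packages hook into this same pipeline to hand the IR to a vendor backend.

```julia
using InteractiveUtils  # provides code_llvm outside the REPL

# A throwaway example function; the compiler specializes it per
# concrete argument type and lowers it to LLVM IR.
square(x) = x * x

# Print the LLVM IR generated for `square` specialized on Int64.
code_llvm(stdout, square, (Int64,))
```

Running this prints a tiny LLVM function built around an integer multiply; calling `code_llvm` with `(Float64,)` instead shows a floating-point multiply, which is the per-type specialization at work.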
As for syntax, Julia scales from a scripting language to a fully typed language. You can write valid and performant code without specifying any types, but you can also specialize methods for specific types. The type annotation uses `::`, and types can take parameters in curly brackets. The other aspect that makes this specific example complicated is the use of Lisp-like macros, whose names start with `@`; these allow for the code transformation I described earlier. The last aspect is that the author makes extensive use of Unicode. This is purely optional, as you can write Julia in plain ASCII. Some authors like to use `∈` instead of `in`.
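To make that concrete (the function names here are hypothetical, purely for illustration): the same computation can be written with no annotations at all, or restricted to a parametric type, and `∈` is just another spelling of `in`.

```julia
# Untyped: the compiler still specializes this per concrete
# argument type, so there is no performance penalty.
norm2(v) = sum(x -> x^2, v)

# Typed: a method restricted to any AbstractVector whose element
# type T is a subtype of Real, using `::` and curly-bracket
# type parameters.
norm2_typed(v::AbstractVector{T}) where {T<:Real} = sum(x -> x^2, v)

# Unicode is optional sugar: `∈` and `in` are the same function.
contains_zero(v) = 0 ∈ v

# Macros (names starting with `@`) transform code before it runs;
# @assert rewrites its argument into a checked expression.
@assert norm2([3, 4]) == 25
```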
It turns out that Julia is ~a Lisp, just with a weird syntax. If you look at the metaprogramming facilities, all expressions are turned into s-expression-like `Expr` trees during parsing. There is no problem having a Lisp syntax for Julia, and in fact this has been implemented! (https://github.com/swadey/LispSyntax.jl)
https://docs.julialang.org/en/v1/manual/metaprogramming/
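A quick demonstration of that point, using only the standard metaprogramming tools from the manual linked above: `Meta.parse` returns an `Expr` tree, and `Meta.show_sexpr` prints that same tree in Lisp-style s-expression form.

```julia
# Parsing Julia source yields an Expr tree: a head symbol plus
# arguments, structurally an s-expression.
ex = Meta.parse("a + b * c")
@assert ex.head == :call      # the outer node is a function call
@assert ex.args[1] == :+      # calling `+`

# Print the same tree Lisp-style, nested calls and all:
Meta.show_sexpr(ex)
```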
I'm currently playing around with Oceananigans.jl (https://github.com/CliMA/Oceananigans.jl). Do you know how the two compare?
Oceananigans.jl has really intuitive step-by-step examples and a great discussion page on GitHub.
https://github.com/odsl-team/julia-ml-from-scratch/issues/2
Summarizing, they benchmark some machine learning code that uses KernelAbstractions.jl on different platforms and find: