SumTypes.jl vs Dagger.jl

| | SumTypes.jl | Dagger.jl |
|---|---|---|
| Mentions | 2 | 4 |
| Stars | 93 | 584 |
| Growth | - | 2.2% |
| Activity | 7.8 | 8.9 |
| Latest Commit | 3 months ago | 3 days ago |
| Language | Julia | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
SumTypes.jl
- Enums in Rust – and why they feel better
An interesting aspect of sum types (what Rust calls enums) is that you can implement them in the language as a library if you have real unions, but not vice-versa.
Here's my example of sum types implemented in Julia as a regular package: https://github.com/MasonProtter/SumTypes.jl
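The claim above — that sum types can be built on top of real unions as a library — can be illustrated with a minimal sketch in plain Julia (standard library only; this is an illustration of the idea, not SumTypes.jl's actual implementation):

```julia
# A hand-rolled sum type: Shape = Circle | Rect, built from plain
# structs and a Union — roughly what a sum-type macro could expand to.
struct Circle
    radius::Float64
end

struct Rect
    width::Float64
    height::Float64
end

const Shape = Union{Circle, Rect}

# "Pattern matching" via multiple dispatch: one method per variant.
# Because the Union is closed, these two methods are exhaustive.
area(s::Circle) = pi * s.radius^2
area(s::Rect)   = s.width * s.height

shapes = Shape[Circle(1.0), Rect(2.0, 3.0)]
total = sum(area, shapes)  # pi + 6.0
```

The reverse direction fails because a tagged sum type always wraps its payload in a variant constructor, so it cannot express an untagged, flat union like `Union{Int, String}`.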
- From Julia to Rust
> Pattern matching
MLStyle.jl [1] is quite nice for this and has been around for a while.
> Tagged, closed unions
These are less general than 'real' unions and can be implemented using them. E.g. SumTypes.jl [2] has some macros to make it a bit more convenient to define them; it could still use some quality-of-life features, though.
[1] https://thautwarm.github.io/MLStyle.jl/latest/syntax/pattern...
[2] https://github.com/MasonProtter/SumTypes.jl
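The distinction drawn above — real, untagged unions versus tagged, closed unions — is visible directly in base Julia, where `Union` is structural and flat rather than tagged (plain-Julia sketch, no packages required):

```julia
# Julia's Union is untagged and structural: values are stored bare,
# with no variant wrapper around them.
describe(v::Union{Int, String}) = v isa Int ? "int: $v" : "string: $v"

# Unions also flatten: a Union of Unions normalizes to a single
# flat Union, something a nested tagged sum type cannot do.
flattened = (Union{Union{Int, String}, String} == Union{Int, String})  # true
```

A tagged sum type, by contrast, wraps each payload in a variant constructor, so two variants carrying the same payload type remain distinct — which is exactly why sum types can be layered on unions but not the other way around.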
Dagger.jl
- Dagger: a new way to build CI/CD pipelines
- DTable – a new distributed table implementation in Julia using Dagger.jl
Firstly, I'll say that we already have work started to implement out-of-core directly in Dagger: https://github.com/JuliaParallel/Dagger.jl/pull/289.
With that PR in place, it should be possible to define a "storage device" backed by a database. I haven't had a chance to actually try this, since the PR still needs quite a bit of work and testing, but it's definitely on my radar!
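For context on what Dagger provides, here is a minimal hedged sketch of its task-graph API (assuming Dagger.jl is installed; `Dagger.@spawn` schedules a task and returns a lazy handle whose result you retrieve with `fetch` — this is only a usage sketch, not the out-of-core machinery discussed in the PR):

```julia
using Dagger

# Each @spawn returns a lazy thunk; passing one thunk into another
# establishes a dependency edge in the task graph.
a = Dagger.@spawn 1 + 2
b = Dagger.@spawn a * 3

fetch(b)  # 9, computed once the scheduler runs the graph
```

The storage-device idea above would slot in underneath this API: where a chunk of data lives (memory, disk, a database) becomes a scheduler concern rather than something the user-facing task graph has to express.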
- From Julia to Rust
- Cerebras’ New Monster AI Chip Adds 1.4T Transistors
I'm not sure that's necessarily the domain of a low-level package like CUDA.jl though (which I assume you're referring to). That kind of interface is more the domain of higher-level packages like https://github.com/JuliaParallel/Dagger.jl/ and, to a lesser extent, https://juliagpu.github.io/KernelAbstractions.jl/stable/. Moreover, the jury is still out on whether the built-in Distributed module is an ideal abstraction for every use-case (clusters, heterogeneous compute, etc.).
WRT Nx, my biggest question is how they'll crack the problem of still needing big balls of C++ and the shims everywhere to get acceleration. Creating a compiler that generates efficient GPU or other accelerator code is a massive research project with no clear winners, never mind the challenge of reconciling the very mutation-heavy needs of GPU compute with a mostly immutable language model.
What are some alternatives?
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
earthly - Super simple build framework with fast, repeatable builds and an instantly familiar syntax – like Dockerfile and Makefile had a baby.
StaticArrays.jl - Statically sized arrays for Julia
julia - The Julia Programming Language
Catlab.jl - A framework for applied category theory in the Julia language
DuckDB.jl
Symbolics.jl - Symbolic programming for the next generation of numerical software
determined - Determined is an open-source machine learning platform that simplifies distributed training, hyperparameter tuning, experiment tracking, and resource management. Works with PyTorch and TensorFlow.
egg - egg is a flexible, high-performance e-graph library
Metatheory.jl - Makes Julia reason with equations. General purpose metaprogramming, symbolic computation and algebraic equational reasoning library for the Julia programming language: E-Graphs & equality saturation, term rewriting and more.