Distributions.jl
Petalisp
| | Distributions.jl | Petalisp |
|---|---|---|
| Mentions | 6 | 17 |
| Stars | 1,070 | 423 |
| Growth | 0.9% | - |
| Activity | 7.6 | 8.5 |
| Latest commit | 3 days ago | about 1 month ago |
| Language | Julia | Common Lisp |
| License | GNU General Public License v3.0 or later | GNU Affero General Public License v3.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Distributions.jl
-
Yann Lecun: ML would have advanced if other lang had been adopted versus Python
If you look at Julia open source projects, you'll see that they tend to have far more contributors than their Python counterparts, even over shorter time spans. A package for defining statistical distributions has had 202 contributors (https://github.com/JuliaStats/Distributions.jl), and Julia Base itself has had over 1,300 contributors (https://github.com/JuliaLang/julia), which is quite a lot for a core language; that's mostly because the majority of the core is written in Julia itself.
This is one of the things that was noted quite a bit at this SIAM CSE conference: Julia development tends to have much more code reuse than other ecosystems like Python. For example, the machine learning libraries Flux.jl and Lux.jl share layer intrinsics in NNlib.jl (https://github.com/FluxML/NNlib.jl), the same GPU libraries (https://github.com/JuliaGPU/CUDA.jl), the same automatic differentiation library (https://github.com/FluxML/Zygote.jl), and of course the same JIT compiler (Julia itself). These two libraries are far enough apart that people say "Flux is to PyTorch as Lux is to JAX/flax", yet while the Python pair share almost no code or implementation, the Julia pair share >90% of their core internals and differ mainly in their higher-level APIs.
If you haven't participated in this space, it's hard to fathom how much code reuse goes on and how that is shaped by the design of multiple dispatch. This is one of the reasons there is so much cohesion in the community: it doesn't matter if one person is an ecologist and another is a financial engineer; both may be contributing to the same library, like Distances.jl, each adding a distance function that is then used in thousands of places. The Python ecosystem tends to have far more "megapackages" (PyTorch, SciPy, etc.) where the barrier to entry is generally much higher (and sometimes requires wrangling the build systems, fun times). In the Julia ecosystem, by contrast, a lot of core development happens in "small" but central libraries like Distances.jl or Distributions.jl, which are simple enough for an undergrad to get productive in within a week but are then used everywhere (Distributions.jl, for example, is used in every statistics package and for defining prior distributions in Turing.jl's probabilistic programming language).
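The dispatch-driven reuse described above can be sketched in a few lines. This is a self-contained toy, not the real Distances.jl API: each contributor adds one small `evaluate` method for their metric type, and every generic algorithm written against the abstract type (here, a `pairwise` matrix) picks it up for free.

```julia
# Minimal sketch of code reuse via multiple dispatch
# (names are illustrative, not the Distances.jl API).
abstract type Metric end

struct Euclidean <: Metric end
struct Cityblock <: Metric end

# Each contributor only adds one small method...
evaluate(::Euclidean, a, b) = sqrt(sum((a .- b) .^ 2))
evaluate(::Cityblock, a, b) = sum(abs.(a .- b))

# ...and every generic algorithm written against `Metric` reuses it.
pairwise(m::Metric, xs) = [evaluate(m, a, b) for a in xs, b in xs]

xs = [[0.0, 0.0], [3.0, 4.0]]
pairwise(Euclidean(), xs)   # 2×2 matrix; off-diagonal entries are 5.0
pairwise(Cityblock(), xs)   # off-diagonal entries are 7.0
```

Note that `pairwise` never needs to be touched when a new metric is added; that is the mechanism behind the "add one function, it gets used in thousands of places" effect.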
-
Don't waste your time on Julia
...so the blog post you've posted 4 times contains a list of issues the author filed in 2020-2021, and at least the handful I clicked have indeed (long) been sorted out. For example, one filed Dec 18th 2020 was closed Dec 20th.
-
Julia ranks in the top most loved programming languages for 2022
Well, out of the issues mentioned, the ones still open can be categorized as:
1. aliasing problems with mutable vectors: https://github.com/JuliaLang/julia/issues/39385, https://github.com/JuliaLang/julia/issues/39460
2. not handling OffsetArrays correctly: https://github.com/JuliaStats/StatsBase.jl/issues/646, https://github.com/JuliaStats/StatsBase.jl/issues/638, https://github.com/JuliaStats/Distributions.jl/issues/1265, https://github.com/JuliaStats/StatsBase.jl/issues/643
3. bad interaction of buffering and I/O redirection: https://github.com/JuliaLang/julia/issues/36069
4. a type dispatch bug: https://github.com/JuliaLang/julia/issues/41096
So if you avoid mutable vectors and OffsetArrays you should generally be fine.
As for the argument "Julia is really buggy, so it's unusable": I think this can be made about any language; e.g., `rand` not being random enough, or Java's binary search algorithm overflowing. The fixed issues have regression tests added, so they won't happen again. Copying the test suites of libraries in other languages might have caught these issues earlier, but a new system will have more bugs than a mature one, so some amount of bugginess is unavoidable.
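The Java binary search bug mentioned above is the classic midpoint overflow, and it is easy to reproduce as a sketch in Julia (whose fixed-width `Int` also wraps on overflow): with indices near `typemax(Int)`, `(lo + hi) ÷ 2` wraps to a negative number, while `lo + (hi - lo) ÷ 2` stays in range.

```julia
# The classic binary-search midpoint overflow, reproduced in Julia.
lo, hi = typemax(Int) - 10, typemax(Int) - 2

buggy_mid = (lo + hi) ÷ 2        # lo + hi wraps around to a negative value
safe_mid  = lo + (hi - lo) ÷ 2   # hi - lo is small, so this stays in range

buggy_mid < 0          # true: the midpoint wrapped around
lo <= safe_mid <= hi   # true: the corrected form is a valid index
```

This is exactly the kind of bug that lurked in a "mature" standard library for years, which is the point of the comparison: maturity reduces bug counts but does not eliminate them.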
- The Julia language has a number of correctness flaws
-
Does a Julia package have to live in a separate file?
See the Distributions.jl package for an example .jl file structure: https://github.com/JuliaStats/Distributions.jl/tree/master/src
-
Organizing a Julia program
Structure your program around your domain-specific constraints. For example, Distributions.jl has folders for univariate/multivariate and discrete/continuous distributions, with one file per distribution containing the struct plus all of its methods.
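A "struct + all its methods in one file" layout might look like the following self-contained sketch. The type and function names here are made up for illustration; the real Distributions.jl has its own type hierarchy and API.

```julia
# Hypothetical one-file-per-distribution layout: the struct and
# every method that operates on it live together.
struct MyExponential
    rate::Float64
end

mean(d::MyExponential) = 1 / d.rate
pdf(d::MyExponential, x) = x < 0 ? 0.0 : d.rate * exp(-d.rate * x)
cdf(d::MyExponential, x) = x < 0 ? 0.0 : 1 - exp(-d.rate * x)

d = MyExponential(2.0)
mean(d)      # 0.5
cdf(d, 0.0)  # 0.0
```

Grouping a distribution's struct with its `mean`/`pdf`/`cdf` methods keeps each file a complete, readable unit, which is what makes the per-distribution file layout easy to navigate.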
Petalisp
- Petalisp: Elegant High Performance Computing
- Is there a tutorial for automatic differentiation with petalisp?
-
Is there a language with lisp syntax but C semantics?
While not "as fast as C" (C is not the absolute pinnacle of performance), Common Lisp is incredibly fast compared to the majority of programming languages around today. There is even a huge amount of ongoing work being done to make it faster still. We are seeing many interesting projects that make better use of the hardware in your computer (e.g. https://github.com/marcoheisig/Petalisp).
-
Common Lisp Implementations in 2023
I think the lisp-stat library is actually being developed. However, one numerical CL library that doesn't get enough mention and is constantly being developed is Petalisp, for HPC:
https://github.com/marcoheisig/Petalisp
-
numericals - Performance of NumPy with the goodness of Common Lisp
However, if you have a lisp library that puts those semantics to use, then you could get it to employ magicl/ext-blas and cl-bmas to speed it up. (petalisp looks relevant, but I lack the background to compare it with APL.)
-
New Lisp-Stat Release
> This means CL packages can be "done".
This is true if there is nothing functional left to add to a package, but it is very much not true for ML frameworks right now: new things are being added in the field all the time. Even so, the package I linked has the necessary ingredients for any deep learning model: CUDA support and backpropagation. The other person mentioned convolution, which I think is fairly trivial to implement, but still: if you expect everything to be ready-made for you, you should probably stick to TensorFlow and PyTorch. If you want to explore the cutting edge and push the boundaries, then I think Common Lisp is a good tool. As an aside, it might also be interesting to note that a Common Lisp package (Petalisp) is being used for high performance computing by a German university:
https://github.com/marcoheisig/Petalisp
- The Julia language has a number of correctness flaws
-
When a young programmer who has been using C for several years is convinced that C is the best possible programming language and that people who don't prefer it just haven't used it enough, what is the best argument for Lisp vs C, given that they're already convinced in favor of C?
One trick is that Common Lisp can generate and compile code at runtime, whereas static languages typically do not have a compiler available at runtime. This lets you make your own lazy person's JIT/staged compiler, which is useful if some part of the problem is not known at compile-time. Such an approach has been used at least for array munging, type munging and regular expression munging.
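The "lazy person's staged compiler" idea can be made concrete with a toy example. The comment above is about Common Lisp, but the same property holds in Julia (the other language on this page), which is used here for consistency with the rest of the examples; the function name is made up. A parameter known only at runtime (here, an exponent `n`) is baked into a freshly generated and compiled function:

```julia
# Toy staged compilation: build an expression at runtime, then
# compile it into a specialized function for that runtime value.
function make_pow(n::Int)
    body = :(1)
    for _ in 1:n
        body = :(x * $body)   # unroll the loop into x * (x * ... * 1)
    end
    @eval x -> $body          # compile the generated code
end

pow3 = make_pow(3)
pow3(2.0)   # 8.0, computed by the specialized x * (x * (x * 1))
```

The generated function contains no loop and no reference to `n`; the runtime-known value has been compiled away, which is exactly the benefit the comment describes for array, type, and regex munging.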
What are some alternatives?
MLJ.jl - A Julia machine learning framework
awesome-cl - A curated list of awesome Common Lisp frameworks, libraries and other shiny stuff.
HypothesisTests.jl - Hypothesis tests for Julia
JWM - Cross-platform window management and OS integration library for Java
Optimization.jl - Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
cl-cuda - Cl-cuda is a library to use NVIDIA CUDA in Common Lisp programs.
StatsBase.jl - Basic statistics for Julia
magicl - Matrix Algebra proGrams In Common Lisp.
Lux.jl - Explicitly Parameterized Neural Networks in Julia
lish - Lisp Shell
StaticLint.jl - Static Code Analysis for Julia