If you look at the thread for your first reference, there were a large number of suggested performance improvements that, combined, resulted in a 30x speedup. I'm not sure what you're looking at for your second link, but Julia is faster than Lisp in the n-body, spectral-norm, mandelbrot, pidigits, regex-redux, fasta, k-nucleotide, and reverse-complement benchmarks (8 out of 10). For Julia going faster than C/Fortran, I would direct you to https://github.com/JuliaLinearAlgebra/Octavian.jl, a Julia program that beats MKL and OpenBLAS at matrix multiplication (one of the most heavily optimized algorithms in the world).
For a version that is much faster than the other implementations, see https://github.com/jakobnissen/prechelt_benchmark/blob/master/v2.jl (mentioned at https://discourse.julialang.org/t/help-to-get-my-slow-julia-code-to-run-as-fast-as-rust-java-lisp/65741/87)
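To give a feel for what "beating BLAS from pure Julia" looks like in practice, here is a minimal sketch assuming Octavian.jl's documented `matmul` entry point (the matrix sizes are arbitrary; `A * B` dispatches to whichever BLAS your Julia build links, typically OpenBLAS):

```julia
using Octavian        # pure-Julia matrix multiplication
using LinearAlgebra   # `*` dispatches to BLAS for comparison

A = rand(500, 500)
B = rand(500, 500)

C_octavian = Octavian.matmul(A, B)  # multithreaded pure-Julia matmul
C_blas     = A * B                  # BLAS-backed multiplication

# Both paths should agree up to floating-point roundoff.
@assert C_octavian ≈ C_blas
```

The point is that `Octavian.matmul` is ordinary Julia code, not a wrapper around a C/Fortran library, yet it is competitive with (and for many sizes faster than) the hand-tuned BLAS routines.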
> A 100+ contributor project
Yes, there are 3-5 different automatic differentiation implementations, each focusing on different algorithms and different kinds of code to differentiate. However, when such duplication is discovered, the Julia community tends to jointly implement shared abstractions. The first was ChainRules.jl, which implements the rules for derivatives of mathematical functions (e.g., how to calculate the derivative of the gamma function) in one shared place. The next step is https://github.com/JuliaDiff/AbstractDifferentiation.jl, which unifies the different algorithms behind a common interface.
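The unification described above can be sketched as follows, assuming AbstractDifferentiation.jl's backend-plus-`derivative` API (ForwardDiff is used as the backend here; other AD packages plug into the same call):

```julia
import AbstractDifferentiation as AD
using ForwardDiff  # one concrete AD implementation among several

f(x) = sin(x) * exp(x)

# The backend object selects which AD algorithm runs underneath;
# the calling code stays the same if you swap it for another backend.
backend = AD.ForwardDiffBackend()

(df,) = AD.derivative(backend, f, 1.0)  # returns a tuple of derivatives
@assert df ≈ cos(1.0) * exp(1.0) + sin(1.0) * exp(1.0)
```

The design choice is that user code depends only on the abstract interface, so a library can accept any backend its callers prefer, while ChainRules.jl supplies the shared per-function derivative rules that the backends draw on.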