ModelingToolkit.jl
An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system with integrated symbolics for physics-informed machine learning and automated transformations of differential equations.
Bonus 1: With packages like ModelingToolkit.jl you write an abstract model that is first symbolically simplified and optimized for numerical stability, then automatically parallelized for the target system architecture (CPU, GPU, distributed) and compiled. The cool thing is that everything happens as Julia code transformations; no other low-level language (besides LLVM, very far down the stack) is involved.
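As a minimal sketch of that workflow, here is the well-known Lorenz-system example in ModelingToolkit's `ODESystem`/`structural_simplify` style: the equations are written symbolically, simplified, and only then compiled into a numerical problem. The specific equations and parameter values are illustrative, and the exact macro API varies between ModelingToolkit versions.

```julia
using ModelingToolkit, OrdinaryDiffEq

# Symbolic independent variable, states, and parameters
@parameters t σ ρ β
@variables x(t) y(t) z(t)
D = Differential(t)

# Abstract (acausal) model: just equations, no solver details
eqs = [D(x) ~ σ * (y - x),
       D(y) ~ x * (ρ - z) - y,
       D(z) ~ x * y - β * z]

@named lorenz = ODESystem(eqs, t)

# Symbolic simplification pass before any numerics happen
sys = structural_simplify(lorenz)

# Only now is the model lowered to a compiled numerical problem
prob = ODEProblem(sys,
                  [x => 1.0, y => 0.0, z => 0.0],   # initial conditions
                  (0.0, 10.0),                       # time span
                  [σ => 10.0, ρ => 28.0, β => 8 / 3])
sol = solve(prob, Tsit5())
```

The point is the separation of concerns: the model is pure symbolic Julia, and the simplification, code generation, and compilation steps are ordinary Julia-to-Julia transformations.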
-
Concerning the second point, there seem to be ways to get even faster code than by most carefully optimizing it by hand: code that is automatically adapted to both the problem and the hardware. Here are benchmark results comparing the matmul performance of the highly optimized OpenBLAS and MKL libraries with automatic optimizations written in pure Julia. Octavian.jl reaches superior performance for small-to-medium matrix sizes and is comparable for large matrices. more benchmarks
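For a quick sense of what those benchmarks compare, here is a minimal sketch using Octavian's exported `matmul` (the pure-Julia multiply) against Julia's built-in BLAS-backed `*`. The matrix size is arbitrary; this only checks agreement, not timing, which the linked benchmarks cover properly.

```julia
using Octavian, LinearAlgebra

n = 200
A = rand(n, n)
B = rand(n, n)

# Pure-Julia, hardware-adaptive matrix multiply from Octavian.jl
C = Octavian.matmul(A, B)

# Reference result from the default BLAS (OpenBLAS) via `*`
C_blas = A * B

# The two should agree to floating-point tolerance
C ≈ C_blas
```

For real measurements you would wrap both calls in `@btime` from BenchmarkTools.jl and sweep `n`, since the crossover behavior between Octavian and BLAS depends on matrix size and thread count.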