SymbolicNumericIntegration.jl vs MuladdMacro.jl

| | SymbolicNumericIntegration.jl | MuladdMacro.jl |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 113 | 45 |
| Growth | 0.0% | - |
| Activity | 7.3 | 6.3 |
| Latest Commit | 2 days ago | 24 days ago |
| Language | Julia | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
SymbolicNumericIntegration.jl
[2201.12468] Symbolic-Numeric Integration of Univariate Expressions based on Sparse Regression
The repository associated with this paper is https://github.com/SciML/SymbolicNumericIntegration.jl.
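As a minimal sketch of what the package does (assuming its exported `integrate` entry point and a Symbolics.jl variable; the exact return format may differ between versions), symbolic-numeric integration of a univariate expression looks like:

```julia
using Symbolics
using SymbolicNumericIntegration

@variables x

# integrate attempts a closed-form antiderivative via sparse regression
# over a basis of candidate terms; it returns the solved part, any
# unsolved residual, and a numerical error estimate.
solved, unsolved, err = integrate(x * sin(x))
```

Here the expected antiderivative is `sin(x) - x*cos(x)`; when the sparse regression succeeds, the residual term is zero.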
MuladdMacro.jl
std::clamp generates less efficient assembly than std::min(max, std::max(min, v))
Totally agreed. In Julia we use https://github.com/SciML/MuladdMacro.jl all over the place so that way it's contextual and does not bleed into other functions. fast-math changing everything is just... dangerous.
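A quick illustration of that contextual scoping, using MuladdMacro.jl's exported `@muladd` (a minimal sketch; the function name is hypothetical):

```julia
using MuladdMacro

# @muladd rewrites x*y + z patterns inside the annotated expression into
# muladd(x, y, z) calls. The rewrite applies only within the macro's
# scope; code outside it, and functions it calls, are untouched.
@muladd function axpy(a, x, y)
    return a * x + y   # lowered to muladd(a, x, y)
end

# It also works on a single expression:
r = @muladd 2.0 * 3.0 + 1.0
```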
Someone’s Been Messing with My Subnormals
But if what you want is automatic FMA, then why carry along every other possible behavior with it? Just because you want FMA, suddenly NaNs are turned into Infs, subnormal numbers go to zero, handling of sin(x) at small values is inaccurate, etc.? To me that's painting numerical handling in way too broad strokes. FMA only increases numerical accuracy; it never decreases it. Bundling it with unsafe transformations makes it uncertain whether accuracy has improved or degraded.
For reference, to handle this well we use MuladdMacro.jl, a semantic transformation that turns `x*y + z` into `muladd` expressions. It does not recurse into called functions, so it does not change the definitions of functions invoked inside the macro's scope.
https://github.com/SciML/MuladdMacro.jl
This is a transformation that always increases both performance and accuracy, because it is targeted to do only a rewrite with that property (performance because `muladd` in Julia applies FMA only when hardware FMA exists, so it effectively never falls back to software FMA emulation).
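The distinction the comment draws exists in Base Julia itself: `fma` demands a fused operation with a single rounding (emulating it in software if the hardware lacks FMA), while `muladd` merely permits fusion when it is fast. A small sketch:

```julia
x, y, z = 0.1, 0.2, -0.02

# fma always computes x*y + z with a single rounding step,
# falling back to slow software emulation without hardware FMA.
fused = fma(x, y, z)

# muladd may fuse or may compute round(x*y) + z, whichever is faster;
# on hardware with FMA it fuses and matches fma.
maybe_fused = muladd(x, y, z)

# The plain expression rounds twice: once after x*y, once after + z.
unfused = x * y + z
```

For values like these, `fused` and `unfused` can differ in the last bits, which is exactly the extra accuracy FMA provides.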
- Julia macros
What are some alternatives?
- SymbolicRegression.jl - Distributed High-Performance Symbolic Regression in Julia
- Catalyst.jl - Chemical reaction network and systems biology interface for scientific machine learning (SciML). High performance, GPU-parallelized, and O(1) solvers in open source software.
- ParameterizedFunctions.jl - A simple domain-specific language (DSL) for defining differential equations for use in scientific machine learning (SciML) and other applications
- DataDrivenDiffEq.jl - Data-driven modeling and automated discovery of dynamical systems for the SciML Scientific Machine Learning organization
- JuMP.jl - Modeling language for Mathematical Optimization (linear, mixed-integer, conic, semidefinite, nonlinear)
- ModelingToolkit.jl - An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations
- Unityper.jl