| | Symbolics.jl | Metatheory.jl |
|---|---|---|
| Mentions | 13 | 5 |
| Stars | 1,291 | 334 |
| Growth | 1.0% | 1.2% |
| Activity | 9.4 | 8.1 |
| Latest commit | 4 days ago | 6 days ago |
| Language | Julia | Julia |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Symbolics.jl
- Symbolics.jl
What packages would you like Julia to have?
It’s not up to parity with SymPy/MATLAB yet, by far - here’s the tracking issue on it: https://github.com/JuliaSymbolics/Symbolics.jl/issues/59
- Converting Symbolics.jl Objects to SymPy.jl Objects
Error With StaticArrays Module & Symbolics.jl
Hello Julia Community. This is my second day working with Julia, having come over from SymPy for performance reasons. I am working on a project that requires calculating matrix determinants and adjugates for families of matrices with symbolic entries. I am using Symbolics.jl for the symbols and Julia 1.8.2.
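For readers attempting the same workflow, here is a minimal sketch of a symbolic determinant and adjugate in Symbolics.jl (the 2x2 matrix and variable names are illustrative, not taken from the post):

```julia
using Symbolics, LinearAlgebra

# Symbolic entries for a 2x2 matrix
@variables a b c d
M = [a b; c d]

# Generic LinearAlgebra routines accept symbolic matrices
D = simplify(det(M))        # determinant as a symbolic expression

# Adjugate via the cofactor formula (written out for the 2x2 case)
adjM = [d -b; -c a]

# Sanity check: M * adj(M) should simplify to det(M) * I
simplify.(M * adjM)
```

For larger matrix families, the resulting expressions can be compiled into fast numeric functions with `build_function`, which is where Symbolics.jl tends to pay off over an interpreted CAS.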
- ModelingToolkit over Modelica
A Mature Library For Symbolic Computation?
After spending some time reading the documentation, it turns out that JuliaSymbolics also lacks factorizations functionality (according to [Link](https://github.com/JuliaSymbolics/Symbolics.jl/issues/59))
Looking for numerical/iterative approach for determining a value
You can also get an expression for the partial of β with respect to h using Symbolics.jl:
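As a sketch of what that looks like (the expression below is an illustrative placeholder; the thread's actual β(h) relation is not reproduced here):

```julia
using Symbolics

@variables h
# Placeholder for the relation β(h); substitute the real expression here
β = sin(h)^2 + 2h

# Build the partial derivative ∂β/∂h and expand it symbolically
Dh = Differential(h)
∂β∂h = expand_derivatives(Dh(β))
```

The symbolic derivative can then be turned into a compiled numeric function with `build_function` and fed to an iterative root-finder.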
In 2022, the difference between symbolic computing and compiler optimizations will be erased in #julialang. Anyone who can come up with a set of symbolic mathematical rules will automatically receive an optimized compiler pass to build better code
The example is applied to the right-hand side of a generated mass-matrix ODE (DAE), which is then solved using the adaptive time-stepping methods of DifferentialEquations.jl. It's a test example that comes from the robotics / rigid body dynamics simulation groups (specifically interested in control), where they were previously generating the governing equations with SymPy and recently switched to try Symbolics.jl (we got the example because of some performance issues that needed fixing). The comparison is with and without applying the code simplifier before solving. The table shows an average global induced error of 1e-12 when chopping off the 1e-11 * sin(x) terms and smaller. Thus there's nothing "competitive" against standard adaptive time stepping here: it's used to enhance the simulation of generated models that are simulated with the adaptive time steppers.
- From Julia to Rust
Fractions in Julia Symbolics
Done. https://github.com/JuliaSymbolics/Symbolics.jl/issues/215
Metatheory.jl
[ANN] E-graphs and equality saturation: hegg 0.1
I'd love to see something in the lines of Julia's https://juliasymbolics.github.io/Metatheory.jl/dev/
Twitter Thread: Symbolic Computing for Compiler Optimizations in Julia
From that example you can see how this makes some rather difficult compiler questions all be subsumed in the e-graph saturation solve. That solve itself isn't easy: it's an NP-hard problem that requires good heuristics, and that's what Metatheory.jl, and chunks of the thesis, are about. But given a good enough solver, the ability to write such transformation passes becomes rather trivial, and you get an optimal solution in the sense of the chosen cost function. So problems like enabling automatic FMA on specific codes become rather simple with this tool: just declare a*b + c = fma(a,b,c), where the former has a cost of 2 and the latter a cost of 1, and let it rip.
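A sketch of that FMA rule using Metatheory.jl's documented rewriting DSL (`@theory`, `saturate!`, `extract!`; exact details may vary across package versions):

```julia
using Metatheory

# Directed rule: rewrite a*b + c into a single fused multiply-add
fma_rules = @theory a b c begin
    a * b + c --> fma(a, b, c)
end

# Build an e-graph for an expression, saturate it with the rule,
# then extract the cheapest equivalent term by AST size
g = EGraph(:(x * y + z))
saturate!(g, fma_rules)
extract!(g, astsize)   # the fma form has fewer nodes than x*y + z
```

Here `astsize` plays the role of the cost function: extraction walks the saturated e-graph and picks the representative with the lowest total cost.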
Show HN: prometeo – a Python-to-C transpiler for high-performance computing
Well, IMO it can definitely be rewritten in Julia, and more easily than in Python, since Julia allows hooking into the compiler pipeline at many points in the stack. It's lispy and built from the ground up for codegen, with libraries like Metatheory.jl (https://github.com/JuliaSymbolics/Metatheory.jl) that provide high-level pattern matching with e-graphs. The question is whether it's worth your time to learn Julia to do so.
You could also do it at the LLVM level: https://github.com/JuliaComputingOSS/llvm-cbe
For interesting takes on that, see https://github.com/JuliaLinearAlgebra/Octavian.jl, which relies on LoopVectorization.jl to do transforms on the Julia AST beyond what LLVM does. Because of that, Octavian.jl beats OpenBLAS on many linear algebra benchmarks.
- From Julia to Rust
- Algebraic Metaprogramming in Julia with Metatheory.jl
What are some alternatives?
julia - The Julia Programming Language
JET.jl - An experimental code analyzer for Julia. No need for additional type annotations.
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
Dagger.jl - A framework for out-of-core and parallel execution
ModelingToolkit.jl - An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations
MacroTools.jl - MacroTools provides a library of tools for working with Julia code and expressions.
fricas - Official repository of the FriCAS computer algebra system
acados - Fast and embedded solvers for nonlinear optimal control
egg - egg is a flexible, high-performance e-graph library
SciMLBenchmarks.jl - Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R