SciPyDiffEq.jl Alternatives
Similar projects and alternatives to SciPyDiffEq.jl
- ffi-overhead: Comparing the C FFI (foreign function interface) overhead across various programming languages.
- prima: PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation of Powell's derivative-free optimization methods: COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. The name stands for Reference Implementation for Powell's Methods with Modernization and Amelioration (P for Powell).
- SciMLBenchmarks.jl: Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, JAX), MATLAB, and R.
- PowerSimulationsDynamics.jl: Julia package to run dynamic power system simulations. Part of the Scalable Integrated Infrastructure Planning Initiative at the National Renewable Energy Lab.
- Optimization.jl: Mathematical optimization in Julia. Local, global, gradient-based, and derivative-free. Linear, quadratic, convex, mixed-integer, and nonlinear optimization in one simple, fast, and differentiable interface.
SciPyDiffEq.jl reviews and mentions
- Building a compile-time SIMD optimized smoothing filter
Did you link the wrong script? The script you show runs everything to statistical significance using Chairmarks.@b.
Also, I don't understand what the issue would be with mixing Python and Julia code in the benchmark script. The Julia side JIT-compiles the invocations, which we've seen removes pretty much all non-Python overhead and in many cases actually makes the resulting SciPy calls faster than issuing them from Python itself; see for example https://github.com/SciML/SciPyDiffEq.jl?tab=readme-ov-file#m... where invocations from Julia very handily outperform SciPy+Numba from the Python REPL. Of course, that is a higher-order function, so it benefits from the Julia JIT in other ways as well, but the point is that in previous benchmarks we've seen the overhead floor be so low (~100ns IIRC) that it didn't affect benchmarks negatively for Python, and it actually improved many Python timings in practice. Though it would be good to show what exactly the difference is in this case in order to isolate any potential issue, I would be surprised if it's more than ~100ns of overhead for an invocation like this, and with 58ms being the benchmark size, that cost is well below the noise floor.
Though trying different datasets is of course valid. There's no reason to reject a benchmark just because it doesn't fit into L3 cache; there are many use cases for that. But that does not mean all use cases are likely to see such a result.
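For context on what those cross-language invocations look like, here is a minimal sketch of calling SciPy's integrators from Julia through SciPyDiffEq.jl, based on the usage pattern in its README (the exact solver export, e.g. `SciPyDiffEq.RK45`, is an assumption from that README, not verified here):

```julia
using SciPyDiffEq

# Exponential growth ODE: du/dt = 1.01 * u
f(u, p, t) = 1.01u
prob = ODEProblem(f, 0.5, (0.0, 1.0))

# Dispatch to SciPy's RK45 through the common DifferentialEquations.jl
# interface; the Julia side JIT-compiles the call path, so most of the
# cross-language dispatch cost is paid once at compile time rather than
# on every invocation
sol = solve(prob, SciPyDiffEq.RK45())
```

This is why the comment above argues the per-call overhead floor sits near the noise floor of a ~58ms benchmark.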
- Good linear algebra libraries
Check out the SciML ecosystem; they are doing amazing work in that space. You might also want to integrate your methods with their libraries, as it would massively boost your potential audience. https://sciml.ai/
- SciPy: Interested in adopting PRIMA, but little appetite for more Fortran code
Interesting response. I develop the Julia SciML organization (https://sciml.ai/), and we'd be more than happy to work with you to get wrappers for PRIMA into Optimization.jl's general interface (https://docs.sciml.ai/Optimization/stable/). Please get in touch and we can figure out how to set this all up. I personally would be curious to try it out and run some benchmarks against NLopt methods.
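For a sense of the interface a PRIMA wrapper would plug into, here is a hedged sketch of Optimization.jl's general API, with NelderMead (from the OptimizationOptimJL solver package) standing in for the derivative-free method; a PRIMA wrapper would be hypothetical and is not shown:

```julia
using Optimization, OptimizationOptimJL

# Rosenbrock objective in Optimization.jl's (x, p) signature,
# with p carrying the (a, b) parameters
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

prob = OptimizationProblem(rosenbrock, zeros(2), [1.0, 100.0])

# Derivative-free solve; a wrapped PRIMA method (e.g. COBYLA) would
# slot in here in place of NelderMead once a wrapper exists
sol = solve(prob, NelderMead())
```

The design point is that solvers are just argument types to `solve`, so adding PRIMA means defining new solver types rather than a new user-facing API.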
- Julia 1.9: A New Era of Performance and Flexibility
Overall, your analysis is very Python-centric. It's not clear to me why Julia should focus on convincing Python users or developers. There are many areas of numerical and scientific computing that are not well served by Python, and it's exactly those areas that Julia is pushing into. The whole SciML ecosystem (https://sciml.ai/) is a great toolbox for writing models and optimizations that would otherwise have required Fortran, C, and MATLAB. Staying within Julia provides access to a consistent set of autodiff technologies to further accelerate those efforts.
- Can Fortran survive another 15 years?
Stats
SciML/SciPyDiffEq.jl is an open-source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of SciPyDiffEq.jl is Julia.
Popular Comparisons
- SciPyDiffEq.jl VS PowerSimulationsDynamics.jl
- SciPyDiffEq.jl VS KiteSimulators.jl
- SciPyDiffEq.jl VS Torch.jl
- SciPyDiffEq.jl VS fpm
- SciPyDiffEq.jl VS RecursiveFactorization
- SciPyDiffEq.jl VS Optimization.jl
- SciPyDiffEq.jl VS SciMLBenchmarks.jl
- SciPyDiffEq.jl VS stdlib
- SciPyDiffEq.jl VS RecursiveFactorization.jl
- SciPyDiffEq.jl VS prima