Optimization.jl vs SciPyDiffEq.jl

Compare Optimization.jl vs SciPyDiffEq.jl and see how they differ.

Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface. (by SciML)
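A minimal sketch of the interface the blurb describes, following the package's documented tutorial (the Rosenbrock objective and the NelderMead solver choice are illustrative):

```julia
using Optimization, OptimizationOptimJL  # OptimizationOptimJL wraps Optim.jl solvers

# Rosenbrock function; minimum at (1, 1) for p = (1, 100)
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

u0 = zeros(2)        # initial guess
p  = [1.0, 100.0]    # parameters

prob = OptimizationProblem(rosenbrock, u0, p)
sol  = solve(prob, NelderMead())   # a derivative-free local method
```

Swapping `NelderMead()` for a solver from another wrapped library is the point of the common interface.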

SciPyDiffEq.jl

Wrappers for the SciPy differential equation solvers for the SciML Scientific Machine Learning organization (by SciML)
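SciPyDiffEq.jl plugs SciPy's integrators into the common DifferentialEquations.jl problem/solve interface. A minimal sketch along the lines of the repository's README (the toy decay problem is illustrative):

```julia
using SciPyDiffEq

# Exponential decay u' = -u, handed to SciPy's RK45 through the
# common DifferentialEquations.jl interface
f(u, p, t) = -u
prob = ODEProblem(f, [1.0], (0.0, 1.0))
sol  = solve(prob, SciPyDiffEq.RK45())
```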

                 Optimization.jl   SciPyDiffEq.jl
Mentions         3                 4
Stars            663               21
Growth           2.1%              -
Activity         9.7               4.8
Latest commit    6 days ago        6 days ago
Language         Julia             Julia
License          MIT License       MIT License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

Optimization.jl

Posts with mentions or reviews of Optimization.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-05-18.
  • SciPy: Interested in adopting PRIMA, but little appetite for more Fortran code
    8 projects | news.ycombinator.com | 18 May 2023
    Interesting response. I develop the Julia SciML organization https://sciml.ai/ and we'd be more than happy to work with you to get wrappers for PRIMA into Optimization.jl's general interface (https://docs.sciml.ai/Optimization/stable/). Please get in touch and we can figure out how to set this all up. I personally would be curious to try this out and do some benchmarks against nlopt methods.
  • Help me to choose an optimization framework for my problem
    2 projects | /r/Julia | 11 Mar 2023
    There are also Optimization and Nonconvex, which seem like umbrella packages, and I am not sure which methods to use inside them. Any help on these?
  • The Julia language has a number of correctness flaws
    19 projects | news.ycombinator.com | 16 May 2022
    > but would you say most packages follow or enforce SemVer?

    The package ecosystem pretty much requires SemVer. If you just say `PackageX = "1"` inside of a Project.toml [compat] section, then it will assume SemVer, i.e. any version 1.x is non-breaking and thus allowed, but not version 2 (see the Project.toml sketch after this post). Some (but very few) packages do `PackageX = ">=1"`, so you could say Julia doesn't force SemVer (because a package can say that it explicitly believes it's compatible with all future versions), but of course that's nonsense and there will always be some bad actors around. So then:

    > Would enforcing a stricter dependency graph fix some of the foot guns of using packages or would that limit composability of packages too much?

    That's not the issue. As above, the dependency graphs are very strict. The issue is always at the periphery (for any package ecosystem, really). In Julia, one thing that can amplify it is the fact that Requires.jl, the hacky conditional dependency system that is very much not recommended for many reasons, cannot specify version requirements on conditional dependencies (a sketch of the pattern follows after this post). I find this to be the root cause of most issues in the "flow" of the package development ecosystem. Most packages are okay, but then oh, I don't want to depend on CUDA for this feature, so a little bit of Requires.jl here, and oh, let me do a small hack for OffsetArrays. And now these little hacky features on the edge are both less tested and not well versioned.

    Thankfully there's a better way to do it by using multi-package repositories with subpackages. For example, https://github.com/SciML/GalacticOptim.jl is a global interface for lots of different optimization libraries, and you can see all of the different subpackages here: https://github.com/SciML/GalacticOptim.jl/tree/master/lib. This lets there be a GalacticOptim package and then a GalacticBBO package, each with its own versioning and tests, while allowing easy co-development of the parts. Very few packages in the Julia ecosystem actually use this (I only know of one other) because the tooling only recently gained support for it, but this is how a lot of packages should be going.

    The upside too is that Requires.jl optional dependency handling is far and away the main source of loading-time issues in Julia (because it blocks precompilation in many ways). So it's really killing two birds with one stone: decreasing package load times by about 99% (that's not even a joke; it's the huge majority of the load time for most packages that are not StaticArrays.jl) while making version dependencies stricter. And now you know what I'm doing this week and what the next blog post will be on, haha.
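As a concrete illustration of the [compat] behavior described in the post above, a hypothetical Project.toml fragment (the package name and UUID are placeholders):

```toml
[deps]
PackageX = "00000000-0000-0000-0000-000000000000"  # placeholder UUID

[compat]
PackageX = "1"     # SemVer: any 1.x is allowed, 2.0 is not
# PackageX = ">=1" # opts out of the upper bound; the discouraged pattern above
```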
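And a sketch of the Requires.jl pattern being criticized, assuming a hypothetical package that wants a CUDA-only code path (the UUID shown is CUDA.jl's registered UUID; `cuda_support.jl` is a made-up file name):

```julia
module MyPkg  # hypothetical package

using Requires

function __init__()
    # Runs only once the user loads CUDA; note there is no way to attach
    # a [compat] version bound to this conditional dependency.
    @require CUDA="052768ef-5323-5732-b1bb-66c8b64840ba" include("cuda_support.jl")
end

end
```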

SciPyDiffEq.jl

Posts with mentions or reviews of SciPyDiffEq.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-05-18.
  • Good linear algebra libraries
    1 project | /r/Julia | 19 May 2023
    Check out the SciML ecosystem. They are doing amazing work in that space. You might also want to integrate your methods with their libraries, as it will boost their potential audience massively. https://sciml.ai/
  • SciPy: Interested in adopting PRIMA, but little appetite for more Fortran code
    8 projects | news.ycombinator.com | 18 May 2023
    Interesting response. I develop the Julia SciML organization https://sciml.ai/ and we'd be more than happy to work with you to get wrappers for PRIMA into Optimization.jl's general interface (https://docs.sciml.ai/Optimization/stable/). Please get in touch and we can figure out how to set this all up. I personally would be curious to try this out and do some benchmarks against nlopt methods.
  • Julia 1.9: A New Era of Performance and Flexibility
    3 projects | /r/Julia | 14 May 2023
    Overall, your analysis is very Python-centric. It's not very clear to me why Julia should focus on convincing Python users or developers. There are many areas of numerical and scientific computing that are not well served by Python, and it's exactly those areas that Julia is pushing into. The whole SciML https://sciml.ai/ ecosystem is a great toolbox for writing models and optimizations that would have otherwise required Fortran, C, and MATLAB. Staying within Julia provides access to a consistent set of autodiff technologies to further accelerate those efforts.
  • Can Fortran survive another 15 years?
    7 projects | news.ycombinator.com | 1 May 2023
    What about the other benchmarks on the same site? https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Bio/BCR/ BCR takes about a hundred seconds and is pretty indicative of systems-biology models, coming from 1122 ODEs with 24388 terms that describe a stiff chemical reaction network modeling the BCR signaling network from Barua et al. Or the discrete diffusion models https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Jumps/Dif... which are the justification behind the claims in https://www.biorxiv.org/content/10.1101/2022.07.30.502135v1 that the O(1) scaling methods scale better than O(log n) scaling for large enough models?

    > If you use special routines (BLAS/LAPACK, ...), use them everywhere as the respective community does.

    It tests with and without BLAS/LAPACK (which isn't always helpful, as you'd see from the benchmarks if you read them). One of the key differences, though, is that there are some pure-Julia tools like https://github.com/JuliaLinearAlgebra/RecursiveFactorization... which outperform the respective OpenBLAS/MKL equivalent in many scenarios, and that's one noted factor in the performance boost (and it is not trivial to wrap into the interface of the other solvers, so it's not done). There are other benchmarks showing that it's not apples to apples and is instead conservative in many cases; for example, https://github.com/SciML/SciPyDiffEq.jl#measuring-overhead shows that handling SciPyDiffEq with the Julia JIT optimizations gives a lower overhead than direct SciPy+Numba, so we use the lower-overhead numbers in https://docs.sciml.ai/SciMLBenchmarksOutput/stable/MultiLang....

    > you must compile/write whole programs in each of the respective languages to enable full compiler/interpreter optimizations

    You do realize that a .so has lower call overhead from a JIT-compiled language than from a statically compiled language like C, because you can optimize away some of the bindings at runtime, right? https://github.com/dyu/ffi-overhead is a measurement of that, and you see LuaJIT and Julia come out faster than C and Fortran there. This shouldn't be surprising once you see how it works (a minimal `ccall` sketch follows after this post).

    I mean, yes, someone can always ask for more benchmarks, but now we have a site that's auto-updating tons and tons of ODE benchmarks, with ODE systems ranging from size 2 to the thousands, with as many things as we can wrap in as many scenarios as we can wrap. And we don't even "win" all of our benchmarks, because unlike for you, these benchmarks aren't for winning but for tracking development (somehow Hacker News folks ignore the utility part and go straight to language wars...).

    If you have a concrete change you think can improve the benchmarks, then please share it at https://github.com/SciML/SciMLBenchmarks.jl. We'll be happy to make and maintain another.
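For what it's worth, the FFI point is easy to demonstrate from the Julia side, where calling into a shared library is a single `ccall` (a minimal sketch; assumes a Unix-like system where `libm` is resolvable):

```julia
# Call cos(1.0) from the system math library (.so) directly.
# The JIT specializes this call site at runtime, so it compiles down to
# a plain C function call with no wrapper layer in between.
x = ccall((:cos, "libm"), Cdouble, (Cdouble,), 1.0)
println(x)  # ≈ 0.5403
```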

What are some alternatives?

When comparing Optimization.jl and SciPyDiffEq.jl you can also consider the following projects:

StatsBase.jl - Basic statistics for Julia

PowerSimulationsDynamics.jl - Julia package to run Dynamic Power System simulations. Part of the Scalable Integrated Infrastructure Planning Initiative at the National Renewable Energy Lab.

Petalisp - Elegant High Performance Computing

KiteSimulators.jl - Simulators for kite power systems

OffsetArrays.jl - Fortran-like arrays with arbitrary, zero or negative starting indices.

Torch.jl - Sensible extensions for exposing torch in Julia.

avm - Efficient and expressive arrayed vector math library with multi-threading and CUDA support in Common Lisp.

SciMLBenchmarks.jl - Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R

Distributions.jl - A Julia package for probability distributions and associated functions.

fpm - Fortran Package Manager (fpm)

StaticLint.jl - Static Code Analysis for Julia

prima - PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation for Powell's derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA means Reference Implementation for Powell's methods with Modernization and Amelioration, P for Powell.