SciPyDiffEq.jl VS prima

Compare SciPyDiffEq.jl vs prima and see what their differences are.

SciPyDiffEq.jl

Wrappers for the SciPy differential equation solvers for the SciML Scientific Machine Learning organization (by SciML)
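For a sense of what the wrapper looks like in use, here is a minimal sketch along the lines of the package README; the Lorenz test problem and the `SciPyDiffEq.RK45()` solver choice are illustrative assumptions rather than the only options.

```julia
# Minimal usage sketch (assumes SciPyDiffEq.jl exposes SciPy's RK45 through the
# common DifferentialEquations.jl-style problem/solve interface, as its README shows).
using SciPyDiffEq

# Classic Lorenz system as a small out-of-place test problem
function lorenz(u, p, t)
    du1 = 10.0 * (u[2] - u[1])
    du2 = u[1] * (28.0 - u[3]) - u[2]
    du3 = u[1] * u[2] - (8.0 / 3.0) * u[3]
    return [du1, du2, du3]
end

u0 = [1.0, 0.0, 0.0]
tspan = (0.0, 10.0)
prob = ODEProblem(lorenz, u0, tspan)
sol = solve(prob, SciPyDiffEq.RK45())   # dispatches to scipy.integrate under the hood
```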

prima

PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation for Powell's derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA means Reference Implementation for Powell's methods with Modernization and Amelioration, P for Powell. (by libprima)
                SciPyDiffEq.jl   prima
Mentions        4                13
Stars           21               279
Growth          -                5.4%
Activity        4.8              9.9
Latest commit   14 days ago      6 days ago
Language        Julia            Fortran
License         MIT License      BSD 3-clause "New" or "Revised" License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

SciPyDiffEq.jl

Posts with mentions or reviews of SciPyDiffEq.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-05-18.
  • Good linear algebra libraries
    1 project | /r/Julia | 19 May 2023
    Check out the SciML ecosystem. They are doing amazing work in that space. You might also want to integrate your methods with their libraries, as it will boost their potential audience massively. https://sciml.ai/
  • SciPy: Interested in adopting PRIMA, but little appetite for more Fortran code
    8 projects | news.ycombinator.com | 18 May 2023
    Interesting response. I develop the Julia SciML organization https://sciml.ai/ and we'd be more than happy to work with you to get wrappers for PRIMA into Optimization.jl's general interface (https://docs.sciml.ai/Optimization/stable/). Please get in touch and we can figure out how to set this all up. I personally would be curious to try this out and do some benchmarks against nlopt methods.
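For context, the Optimization.jl interface mentioned here is a single problem/solve API into which different backends plug. A minimal sketch with a derivative-free method follows; the Nelder-Mead solver from OptimizationOptimJL is used only as a stand-in, since the PRIMA wrapper was only being proposed at this point.

```julia
# Sketch of Optimization.jl's common interface with a derivative-free solver.
# Nelder-Mead (via OptimizationOptimJL) stands in here; a PRIMA or NLopt backend
# would plug into the same solve(prob, alg) slot.
using Optimization, OptimizationOptimJL

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

u0 = zeros(2)
p = [1.0, 100.0]
prob = OptimizationProblem(rosenbrock, u0, p)
sol = solve(prob, NelderMead())   # no gradients required or used
```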
  • Julia 1.9: A New Era of Performance and Flexibility
    3 projects | /r/Julia | 14 May 2023
    Overall, your analysis is very Python-centric. It's not very clear to me why Julia should focus on convincing Python users or developers. There are many areas of numerical and scientific computing that are not well served by Python, and it's exactly those areas that Julia is pushing into. The whole SciML https://sciml.ai/ ecosystem is a great toolbox for writing models and optimizations that would have otherwise required FORTRAN, C, and MATLAB. Staying within Julia provides access to a consistent set of autodiff technologies to further accelerate those efforts.
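As a concrete illustration of the "consistent autodiff" point above, any plain Julia function used in such a model can be differentiated directly; ForwardDiff.jl is shown here as one representative tool, and the test function is just an example.

```julia
# Small autodiff illustration: exact derivatives of an ordinary Julia function,
# with ForwardDiff.jl as one representative of the Julia autodiff tooling.
using ForwardDiff

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

x = [0.5, 0.5]
g = ForwardDiff.gradient(rosenbrock, x)   # no hand-coded derivatives needed
H = ForwardDiff.hessian(rosenbrock, x)
```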
  • Can Fortran survive another 15 years?
    7 projects | news.ycombinator.com | 1 May 2023
    What about the other benchmarks on the same site? https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Bio/BCR/ BCR takes about a hundred seconds and is pretty indicative of systems biological models, coming from 1122 ODEs with 24388 terms that describe a stiff chemical reaction network modeling the BCR signaling network from Barua et al. Or the discrete diffusion models https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Jumps/Dif... which are the justification behind the claims in https://www.biorxiv.org/content/10.1101/2022.07.30.502135v1 that the O(1) scaling methods scale better than O(log n) scaling for large enough models? I mean.

    > If you use special routines (BLAS/LAPACK, ...), use them everywhere as the respective community does.

    It tests with and without BLAS/LAPACK (which isn't always helpful, as you'd see from the benchmarks if you read them). One of the key differences, though, is that there are some pure Julia tools like https://github.com/JuliaLinearAlgebra/RecursiveFactorization... which outperform the respective OpenBLAS/MKL equivalents in many scenarios, and that's one noted factor for the performance boost (and it is not trivial to wrap into the interface of the other solvers, so it's not done). There are other benchmarks showing that the comparison isn't apples to apples and is instead conservative in many cases: for example, https://github.com/SciML/SciPyDiffEq.jl#measuring-overhead shows that the SciPyDiffEq handling with the Julia JIT optimizations gives lower overhead than direct SciPy+Numba, so we use the lower-overhead numbers in https://docs.sciml.ai/SciMLBenchmarksOutput/stable/MultiLang....

    > you must compile/write whole programs in each of the respective languages to enable full compiler/interpreter optimizations

    You do realize that a .so has lower overhead when called from a JIT-compiled language than from a statically compiled language like C, because some of the bindings can be optimized away at runtime? https://github.com/dyu/ffi-overhead is a measurement of that, and there you see LuaJIT and Julia come out faster than C and Fortran. This shouldn't be surprising, because it's pretty clear how that works.

    I mean yes, someone can always ask for more benchmarks, but now we have a site that's auto-updating tons and tons of ODE benchmarks, with ODE systems ranging from size 2 to the thousands, wrapping as many methods in as many scenarios as we can. And we don't even "win" all of our benchmarks, because unlike for you, these benchmarks aren't for winning but for tracking development (somehow Hacker News folks ignore the utility part and go straight to language wars...).

    If you have a concrete change you think can improve the benchmarks, then please share it at https://github.com/SciML/SciMLBenchmarks.jl. We'll be happy to make and maintain another.
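On the FFI-overhead point in the post above: a minimal Julia version of that kind of measurement looks roughly like the sketch below. The dyu/ffi-overhead benchmark calls a tiny custom C function in a tight loop; here a libc function stands in for it, so the numbers are only indicative.

```julia
# Rough sketch of measuring foreign-call overhead from Julia, in the spirit of
# the dyu/ffi-overhead benchmark. labs() from libc stands in for the benchmark's
# custom C function; the point is the per-call cost of the binding itself.
using BenchmarkTools

ffi_call(x) = ccall(:labs, Clong, (Clong,), x)   # direct call into the C runtime

@btime ffi_call(-42)   # per-call cost, typically a few nanoseconds after JIT specialization
```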

prima

Posts with mentions or reviews of prima. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-05-03.
  • Prima has got a Python interface
    3 projects | news.ycombinator.com | 3 May 2024
    The developer of PRIMA here.

    If you use method "cobyla" from scipy.optimize.minimize, then PRIMA already performs far better than it (in terms of the number of function evaluations). See the comparison at https://github.com/libprima/prima#improvements.

    The bugs are indeed only a secondary reason: they can only be triggered in special situations and may not affect your usage at all (but when they do affect you, the consequences are catastrophic).

  • Nagfor supports half-precision floating-point numbers
    1 project | news.ycombinator.com | 6 Mar 2024
    1. nagfor Release 7.1 (Hanzomon) Build 7149, released on March 5, 2024, fixed all the bugs spotted, but introduced an ICE when compiling PRIMA (http://www.libprima.net). The ICE has nothing to do with half-precision reals, because it occurs when PRIMA is configured to use single or double precision. It can be reproduced by

    ```
    git clone https://github.com/libprima/prima.git && cd prima && git checkout ec42cb0 && cd fortran/examples/lincoa && make ntest
    ```

    2. nagfor 7.2, released on 6 March 2024, included neither the ICE nor the fixes for the bugs.

  • PRIMA: Solving general nonlinear optimization problems without derivatives
    1 project | news.ycombinator.com | 28 Feb 2024
  • What are you rewriting in rust?
    36 projects | /r/rust | 10 Jul 2023
    My goal is to rewrite this library for derivative-free optimization: https://github.com/libprima/prima
  • SciPy: Interested in adopting PRIMA, but little appetite for more Fortran code
    8 projects | news.ycombinator.com | 18 May 2023
    A native port is indeed planned. However, since we are talking about a project of about 10K lines of code, such a port will not be delivered very soon.

    In fact, native implementations of PRIMA in Python, MATLAB, C++, Julia, and R will all be done in the future. See https://github.com/libprima/prima#other-languages. But it takes time. PRIMA has been a one-man project since it started three years ago. Community help is greatly needed.

    Thanks.

  • Optimization Without Using Derivatives: the PRIMA Package, its Fortran Implementation, and Its Inclusion in SciPy - Announcements
    1 project | /r/programming | 17 May 2023
    GitHub repo of the project: https://github.com/libprima/prima
  • Optimization Without Derivatives: Prima Fortran Version and Inclusion in SciPy
    8 projects | news.ycombinator.com | 16 May 2023
    It sounds like this was a difficult task. The motivation to fulfill Prof. Powell's request and help the community of derivative-free optimization users must have been strong. Congratulations on your achievement!

    From the GitHub README:

    > In the past years, while working on PRIMA, I have spotted a dozen of bugs in reputable Fortran compilers and two bugs in MATLAB. Each of them represents days of bitter debugging, which finally led to the conclusion that it was not a problem in my code but a flaw in the Fortran compilers or in MATLAB. From a very unusual angle, this reflects how intensive the coding has been.

    > The bitterness behind this "fun" fact is exactly why I work on PRIMA: I hope that all the frustrations that I have experienced will not happen to any user of Powell's methods anymore. I hope I am the last one in the world to decode a maze of 244 GOTOs in 7939 lines of Fortran 77 code — I have been doing this for three years and I do not want anyone else to do it again.

    https://github.com/libprima/prima#a-fun-fact

  • Optimization Without Using Derivatives
    2 projects | news.ycombinator.com | 21 Apr 2023

What are some alternatives?

When comparing SciPyDiffEq.jl and prima you can also consider the following projects:

PowerSimulationsDynamics.jl - Julia package to run Dynamic Power System simulations. Part of the Scalable Integrated Infrastructure Planning Initiative at the National Renewable Energy Lab.

solid-docs - Cumulative documentation for SolidJS and related packages.

KiteSimulators.jl - Simulators for kite power systems

stdlib - Fortran Standard Library

Torch.jl - Sensible extensions for exposing torch in Julia.

pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints

Optimization.jl - Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.

Optimization-Codes-by-ChatGPT - numerical optimization subroutines in Fortran generated by ChatGPT-4

SciMLBenchmarks.jl - Scientific machine learning (SciML) benchmarks, AI for science, and (differential) equation solvers. Covers Julia, Python (PyTorch, Jax), MATLAB, R

inox2d - Native Rust reimplementation of Inochi2D

fpm - Fortran Package Manager (fpm)

OfficerBreaker - OOXML password remover