pybobyqa
prima
| | pybobyqa | prima |
|---|---|---|
| Mentions | 1 | 10 |
| Stars | 71 | 270 |
| Growth | - | 4.8% |
| Activity | 5.8 | 9.9 |
| Latest commit | 18 days ago | 2 days ago |
| Language | Python | Fortran |
| License | GNU General Public License v3.0 only | BSD 3-Clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pybobyqa
-
Gradient-Free-Optimizers A collection of modern optimization methods in Python
I've used this, and it works nicely: https://github.com/numericalalgorithmsgroup/pybobyqa. I'd be happy if it were added to your project, then I could just use yours and have access to a bunch of alternatives with the same API.
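To illustrate the kind of "same API for many solvers" interface being asked for, here is a minimal sketch. This is not Py-BOBYQA's actual implementation (BOBYQA is a model-based trust-region method); it is only a hypothetical compass-search `solve(objfun, x0)` function showing the general shape of a derivative-free optimization call that uses function values alone.

```python
# Hypothetical sketch: a compass (coordinate) search, NOT the BOBYQA
# algorithm itself, illustrating a derivative-free solve(objfun, x0) API.

def solve(objfun, x0, step=0.5, tol=1e-8, max_evals=10_000):
    """Minimize objfun starting from x0 using only function evaluations."""
    x = list(x0)
    f = objfun(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            # Probe each coordinate direction with the current step size.
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = objfun(trial)
                evals += 1
                if ft < f:
                    x, f = trial, ft
                    improved = True
                    break
        if not improved:
            step /= 2  # shrink the stencil when no axis move helps

    return x, f

# Usage: minimize a shifted quadratic whose minimizer is (1, -2).
best_x, best_f = solve(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                       [0.0, 0.0])
print(best_x, best_f)
```

A shared wrapper like this is what makes it cheap to swap in alternatives behind one signature, which is the appeal of collecting solvers such as Py-BOBYQA under a single project.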
prima
-
Nagfor supports half-precision floating-point numbers
1. nagfor Release 7.1 (Hanzomon) Build 7149, released on March 5, 2024, fixed all the bugs spotted, but introduced an ICE (internal compiler error) when compiling PRIMA ( http://www.libprima.net ). The ICE has nothing to do with half-precision reals, because it also occurs when PRIMA is configured to use single or double precision. It can be reproduced by
```
git clone https://github.com/libprima/prima.git && cd prima && git checkout ec42cb0 && cd fortran/examples/lincoa && make ntest
```
2. nagfor 7.2, released on March 6, 2024, contains neither the ICE nor the fixes for those bugs.
- PRIMA: Solving general nonlinear optimization problems without derivatives
-
What are you rewriting in rust?
My goal is to rewrite this library for derivative-free optimization: https://github.com/libprima/prima
-
SciPy: Interested in adopting PRIMA, but little appetite for more Fortran code
A native port is indeed planned. However, since we are talking about a project of about 10K lines of code, such a port will not be delivered very soon.
In fact, native implementations of PRIMA in Python, MATLAB, C++, Julia, and R are all planned. See https://github.com/libprima/prima#other-languages . But it takes time. PRIMA has been a one-man project since it started three years ago. Community help is greatly needed.
Thanks.
-
Optimization Without Using Derivatives: the PRIMA Package, its Fortran Implementation, and Its Inclusion in SciPy - Announcements
GitHub repo of the project: https://github.com/libprima/prima
-
Optimization Without Derivatives: Prima Fortran Version and Inclusion in SciPy
It sounds like this was a difficult task. The motivation to fulfill Prof. Powell's request and help the community of derivative-free optimization users must have been strong. Congratulations on your achievement!
From the GitHub README:
> In the past years, while working on PRIMA, I have spotted a dozen of bugs in reputable Fortran compilers and two bugs in MATLAB. Each of them represents days of bitter debugging, which finally led to the conclusion that it was not a problem in my code but a flaw in the Fortran compilers or in MATLAB. From a very unusual angle, this reflects how intensive the coding has been.
> The bitterness behind this "fun" fact is exactly why I work on PRIMA: I hope that all the frustrations that I have experienced will not happen to any user of Powell's methods anymore. I hope I am the last one in the world to decode a maze of 244 GOTOs in 7939 lines of Fortran 77 code — I have been doing this for three years and I do not want anyone else to do it again.
https://github.com/libprima/prima#a-fun-fact
- Optimization Without Using Derivatives
What are some alternatives?
Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
solid-docs - Cumulative documentation for SolidJS and related packages.
tf-quant-finance - High-performance TensorFlow library for quantitative finance.
stdlib - Fortran Standard Library
PyGenetic - A multi-purpose genetic algorithm written in python
Optimization-Codes-by-ChatGPT - numerical optimization subroutines in Fortran generated by ChatGPT-4
WaveNCC - An app to compute the normalization coefficients of a given set of orthogonal 1D complex wave functions.
inox2d - Native Rust reimplementation of Inochi2D
Gradient-Free-Optimizers - Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
OfficerBreaker - OOXML password remover
BayesianOptimization - A Python implementation of global optimization with gaussian processes.
gmusicbrowser - jukebox for large collections of music