| | Gradient-Free-Optimizers | prima |
|---|---|---|
| Mentions | 11 | 13 |
| Stars | 1,108 | 275 |
| Growth | - | 4.0% |
| Activity | 5.0 | 9.9 |
| Latest commit | 5 days ago | 4 days ago |
| Language | Python | Fortran |
| License | MIT License | BSD 3-clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Gradient-Free-Optimizers
- Show HN: Gradient-Free-Optimizers supports constrained optimization in v1.3
- Gradient-Free-Optimizers version 1.2 released
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
I would be very disappointed if that were the case. No, it looks like it is set up to capture variance. The BO algorithm wraps an "Expected Improvement Optimizer":
https://github.com/SimonBlanke/Gradient-Free-Optimizers/blob...
which selects new points based on both the model's mean estimate and its variance; see around line 58.
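The acquisition rule the comment describes can be sketched with the generic Expected Improvement formula in plain Python. This is not the library's actual code; the function name and defaults here are illustrative:

```python
import math

def expected_improvement(mu, sigma, best, xi=0.01):
    """Expected Improvement for maximization: trades off the surrogate's
    mean prediction (exploitation) against its variance (exploration)."""
    if sigma == 0.0:
        return 0.0
    z = (mu - best - xi) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # Phi(z), normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # phi(z), normal PDF
    return (mu - best - xi) * cdf + sigma * pdf

# Two candidates with the same predicted mean: the one with higher
# variance gets the larger acquisition value, so it is explored first.
low_var = expected_improvement(mu=1.0, sigma=0.1, best=1.0)
high_var = expected_improvement(mu=1.0, sigma=1.0, best=1.0)
assert high_var > low_var
```

The `sigma * pdf` term is what makes the optimizer "capture variance": even a point whose predicted mean does not beat the incumbent can score well if the model is uncertain about it.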
- Hacker News top posts: Feb 28, 2021
  Gradient-Free-Optimizers: A collection of modern optimization methods in Python (0 comments)
- SimonBlanke/Gradient-Free-Optimizers A collection of modern optimization methods in Python
- Gradient-Free-Optimizers: A collection of modern optimization methods in Python
- Optimize any Python function with modern algorithms in numerical search spaces
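The idea behind the last title, optimizing an arbitrary Python function over a numerical search space, can be illustrated with a minimal random-search sketch. This is plain Python, not the library's API; the function and parameter names are mine:

```python
import random

def random_search(objective, search_space, n_iter=1000, seed=0):
    """Minimal random search: sample candidate positions from a discrete
    numerical search space and keep the best score seen so far."""
    rng = random.Random(seed)
    best_pos, best_score = None, float("-inf")
    for _ in range(n_iter):
        # A position is one value per dimension, drawn uniformly.
        pos = {name: rng.choice(values) for name, values in search_space.items()}
        score = objective(pos)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos, best_score

# Maximize -(x^2 + y^2) on a 0.1-spaced grid; the optimum is at x = y = 0.
space = {"x": [i / 10 for i in range(-50, 51)],
         "y": [i / 10 for i in range(-50, 51)]}
best_pos, best_score = random_search(lambda p: -(p["x"] ** 2 + p["y"] ** 2), space)
```

The dict-of-arrays search space and the "maximize a score function" convention mirror the general shape of such gradient-free tools, but the real library offers many smarter samplers than uniform random draws.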
prima
- Prima has got a Python interface
The developer of PRIMA here.
If you use method "cobyla" from scipy.optimize.minimize, then note that PRIMA already performs far better (in terms of the number of function evaluations). See the comparison at https://github.com/libprima/prima#improvements .
The bugs are indeed only a secondary reason: they can only be triggered in special situations. They may not affect your usage at all (but when they do, the consequences are catastrophic).
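The metric in that comparison, the number of objective-function evaluations, is the standard cost measure in derivative-free optimization, since each evaluation may be an expensive simulation. A toy coordinate-search sketch (not PRIMA's algorithm, and all names here are illustrative) shows how such a count is kept:

```python
def coordinate_search(f, x0, step=1.0, tol=1e-6, max_evals=10_000):
    """Toy derivative-free minimizer: probe +/- step along each coordinate,
    move when it improves, halve the step when nothing does.  Returns the
    best point, its value, and the number of function evaluations used."""
    x = list(x0)
    n_evals = 1
    fx = f(x)
    while step > tol and n_evals < max_evals:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                n_evals += 1           # every call to f is counted as cost
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step /= 2.0                # refine the mesh and try again
    return x, fx, n_evals

# Minimize a shifted quadratic; the minimizer is (1, -2).
x, fx, n_evals = coordinate_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                                   [0.0, 0.0])
```

Comparing solvers by `n_evals` at a given accuracy, rather than by wall-clock time, is what statements like "PRIMA performs far better" refer to.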
- Nagfor supports half-precision floating-point numbers
1. nagfor Release 7.1 (Hanzomon) Build 7149, released on March 5, 2024, fixed all the bugs spotted but introduced an ICE (internal compiler error) when compiling PRIMA ( http://www.libprima.net ). The ICE has nothing to do with half-precision reals, because it also occurs when PRIMA is configured to use single or double precision. It can be reproduced by:
```
git clone https://github.com/libprima/prima.git && cd prima && git checkout ec42cb0 && cd fortran/examples/lincoa && make ntest
```
2. nagfor 7.2, released on March 6, 2024, contains neither the ICE nor the fixes for the bugs.
- PRIMA: Solving general nonlinear optimization problems without derivatives
- What are you rewriting in Rust?
My goal is to rewrite this library for derivative-free optimization: https://github.com/libprima/prima
- SciPy: Interested in adopting PRIMA, but little appetite for more Fortran code
A native port is indeed planned. However, since we are talking about a project of about 10K lines of code, such a port will not be delivered very soon.
In fact, native implementations of PRIMA in Python, MATLAB, C++, Julia, and R are all planned. See https://github.com/libprima/prima#other-languages . But it takes time. PRIMA has been a one-man project since it started three years ago. Community help is greatly needed.
Thanks.
- Optimization Without Using Derivatives: the PRIMA Package, its Fortran Implementation, and Its Inclusion in SciPy - Announcements
GitHub repo of the project: https://github.com/libprima/prima
- Optimization Without Derivatives: Prima Fortran Version and Inclusion in SciPy
It sounds like this was a difficult task. The motivation to fulfill Prof. Powell's request and help the community of derivative-free optimization users must have been strong. Congratulations on your achievement!
From the GitHub README:
> In the past years, while working on PRIMA, I have spotted a dozen of bugs in reputable Fortran compilers and two bugs in MATLAB. Each of them represents days of bitter debugging, which finally led to the conclusion that it was not a problem in my code but a flaw in the Fortran compilers or in MATLAB. From a very unusual angle, this reflects how intensive the coding has been.
> The bitterness behind this "fun" fact is exactly why I work on PRIMA: I hope that all the frustrations that I have experienced will not happen to any user of Powell's methods anymore. I hope I am the last one in the world to decode a maze of 244 GOTOs in 7939 lines of Fortran 77 code — I have been doing this for three years and I do not want anyone else to do it again.
https://github.com/libprima/prima#a-fun-fact
- Optimization Without Using Derivatives
What are some alternatives?
Hyperactive - An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
solid-docs - Cumulative documentation for SolidJS and related packages.
opytimizer - 🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
stdlib - Fortran Standard Library
surrogate-models - A collection of surrogate models for sequence model based optimization techniques
pybobyqa - Python-based Derivative-Free Optimization with Bound Constraints
Optimization-Codes-by-ChatGPT - numerical optimization subroutines in Fortran generated by ChatGPT-4
optimization-tutorial - Tutorials for the optimization techniques used in Gradient-Free-Optimizers and Hyperactive.
inox2d - Native Rust reimplementation of Inochi2D
urh - Universal Radio Hacker: Investigate Wireless Protocols Like A Boss
OfficerBreaker - OOXML password remover