Nelder-Mead optimization algorithm: a gradient-free direct search method that minimizes an objective function without requiring its gradient, based on the simplex method of Nelder and Mead (1965). The original source code was taken from ASA047 (https://people.math.sc.edu/Burkardt/f_src/asa047/asa047.html).
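For orientation, the method maintains a simplex of \(n+1\) points in \(n\) dimensions and, each iteration, tries to replace the worst vertex \(x_h\) by a better candidate. Below is a sketch of the textbook update rules with the coefficients commonly quoted for ASA047 (an illustrative summary, not extracted from this implementation; see the source above for the exact logic). With \(\bar{x}\) the centroid of all vertices except \(x_h\):

$$
\begin{aligned}
x_r &= \bar{x} + \alpha\,(\bar{x} - x_h), & \alpha &= 1 \quad \text{(reflection)} \\
x_e &= \bar{x} + \gamma\,(x_r - \bar{x}), & \gamma &= 2 \quad \text{(expansion)} \\
x_c &= \bar{x} + \beta\,(x_h - \bar{x}), & \beta &= \tfrac{1}{2} \quad \text{(contraction)}
\end{aligned}
$$

If none of the candidates improves on the current vertices, the whole simplex is shrunk toward the best vertex.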
Type | Visibility | Attributes | Name | Initial | Description
---|---|---|---|---|---
real(kind=pr) | public | | convergence_tolerance | 1e-5 |
integer | public | | kcount | 1e8 | Maximum number of function evaluations
integer | public | | konvge | 1000 | Convergence check is carried out every KONVGE iterations
integer | public | | max_iters | 10000 | Maximum number of iterations
real(kind=pr) | public | allocatable | parameter_step(:) | |
real(kind=pr) | public | | solver_tolerance | 1e-9_pr |
logical | public | | verbose | |
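As a minimal configuration sketch, the public components above can be set directly on an instance before optimizing. The module name exporting `NelderMead` and the `pr` kind is not shown on this page and is assumed here:

```fortran
! Minimal sketch: assumes `NelderMead` and the `pr` kind are imported
! from this library's module (module name not shown on this page).
type(NelderMead) :: opt

opt%convergence_tolerance = 1e-6_pr    ! tighter than the default 1e-5
opt%max_iters = 5000                   ! cap the number of iterations
opt%konvge = 100                       ! check convergence every 100 iterations
opt%verbose = .true.
allocate(opt%parameter_step(2))
opt%parameter_step = [0.1_pr, 0.1_pr]  ! assumed: per-parameter initial simplex step
```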
Optimize the input function
Type | Intent | Optional | Attributes | Name | Description
---|---|---|---|---|---
class(NelderMead) | intent(inout) | | | self | Optimizer
procedure(obj_func) | | | | foo | Objective function
real(kind=pr) | intent(inout) | | | x(:) | Initial guess and final result
real(kind=pr) | intent(out) | | | F | Objective function value at final step
class(*) | intent(inout) | optional | target | data | Optional data for the objective function
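A minimal end-to-end sketch is shown below. The abstract interface `obj_func` is defined elsewhere in the library; its signature is assumed here to take the point, the function value, an optional gradient, and the optional data argument, so adjust the subroutine to match the actual interface:

```fortran
program nelder_mead_example
   ! Hypothetical module name; use whichever module exports NelderMead and pr.
   use optimization, only: NelderMead, pr
   implicit none

   type(NelderMead) :: opt
   real(pr) :: x(2), F

   x = [-1.2_pr, 1.0_pr]            ! initial guess; overwritten with the result
   call opt%optimize(rosenbrock, x, F)
   print *, "minimum at:", x, "objective value:", F

contains

   subroutine rosenbrock(X, F, dF, data)
      ! Assumed to conform to obj_func (signature assumed, see lead-in).
      real(pr), intent(in) :: X(:)
      real(pr), intent(out) :: F
      real(pr), optional, intent(out) :: dF(:)
      class(*), optional, intent(in out) :: data
      F = (1 - X(1))**2 + 100*(X(2) - X(1)**2)**2
   end subroutine rosenbrock

end program nelder_mead_example
```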