Commit 4a9ddcc

added initial wrapper for migrad (#568)
1 parent b63836f commit 4a9ddcc

File tree

13 files changed: +331 -0 lines changed


.tools/envs/testenv-linux.yml

Lines changed: 1 addition & 0 deletions
@@ -27,6 +27,7 @@ dependencies:
   - pyyaml # dev, tests
   - jinja2 # dev, tests
   - annotated-types # dev, tests
+  - iminuit # dev, tests
   - pip: # dev, tests, docs
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests

.tools/envs/testenv-numpy.yml

Lines changed: 1 addition & 0 deletions
@@ -25,6 +25,7 @@ dependencies:
   - pyyaml # dev, tests
   - jinja2 # dev, tests
   - annotated-types # dev, tests
+  - iminuit # dev, tests
   - pip: # dev, tests, docs
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests

.tools/envs/testenv-others.yml

Lines changed: 1 addition & 0 deletions
@@ -25,6 +25,7 @@ dependencies:
   - pyyaml # dev, tests
   - jinja2 # dev, tests
   - annotated-types # dev, tests
+  - iminuit # dev, tests
   - pip: # dev, tests, docs
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests

.tools/envs/testenv-pandas.yml

Lines changed: 1 addition & 0 deletions
@@ -25,6 +25,7 @@ dependencies:
   - pyyaml # dev, tests
   - jinja2 # dev, tests
   - annotated-types # dev, tests
+  - iminuit # dev, tests
   - pip: # dev, tests, docs
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests

docs/source/algorithms.md

Lines changed: 48 additions & 0 deletions
@@ -3936,6 +3936,54 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
     10 * (number of parameters + 1).
 ```
 
+## Optimizers from iminuit
+
+optimagic supports the [MIGRAD optimizer from iminuit](https://iminuit.readthedocs.io/). To
+use MIGRAD, you need to have
+[the iminuit package](https://github.com/scikit-hep/iminuit) installed (`pip install
+iminuit`).
+
+```{eval-rst}
+.. dropdown:: iminuit_migrad
+
+    .. code-block::
+
+        "iminuit_migrad"
+
+    `MIGRAD <https://iminuit.readthedocs.io/en/stable/reference.html#iminuit.Minuit.migrad>`_ is
+    the workhorse algorithm of the MINUIT optimization suite, which has been widely used in the
+    high-energy physics community since 1975. The iminuit package is a Python interface to the
+    Minuit2 C++ library developed by CERN.
+
+    MIGRAD uses a quasi-Newton method, updating the Hessian matrix iteratively to guide the
+    optimization. The algorithm adapts dynamically to challenging landscapes using several key
+    techniques:
+
+    - **Quasi-Newton updates**: The Hessian is updated iteratively rather than recalculated at
+      each step, improving efficiency.
+    - **Steepest-descent fallback**: When the Hessian update fails, MIGRAD falls back to
+      steepest descent with line search.
+    - **Box-constraint handling**: Parameters with bounds are transformed internally to ensure
+      they remain within the allowed limits.
+    - **Heuristics for numerical stability**: Special cases such as flat gradients or singular
+      Hessians are handled with predefined heuristics.
+    - **Stopping criterion based on the Estimated Distance to Minimum (EDM)**: The optimization
+      halts when the predicted improvement becomes sufficiently small.
+
+    For details see :cite:`JAMES1975343`.
+
+    **Optimizer Parameters:**
+
+    - **stopping.maxfun** (int): Maximum number of function evaluations. If this limit is
+      reached, the optimization stops, but this is not counted as successful convergence.
+      Function evaluations used for numerical gradient calculations do not count toward the
+      limit. Default is 1,000,000.
+
+    - **n_restarts** (int): Number of times to restart the optimizer if convergence is not
+      reached.
+
+      - A value of 1 (the default) indicates that the optimizer will only run once, disabling
+        the restart feature.
+      - Values greater than 1 specify the maximum number of restart attempts.
+```
 
 ## References
 
 ```{eval-rst}
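Editor's note: as orientation for readers of this commit, the new algorithm is selected through optimagic's `minimize` interface. The sketch below is illustrative and not part of the commit: the toy `sphere` objective is made up, the import is guarded because it assumes optimagic and iminuit are installed, and the `algo_options` keys simply mirror the parameters documented above and may be spelled differently in the released API.

```python
# Illustrative sketch of calling the new wrapper via optimagic's minimize().
# Assumes optimagic and iminuit are installed; algo_options keys are taken
# from the documentation added in this commit and may differ in practice.
import numpy as np

def sphere(x):
    # simple convex toy objective with its minimum at the origin
    return x @ x

try:
    import optimagic as om

    res = om.minimize(
        fun=sphere,
        params=np.full(3, 2.0),
        algorithm="iminuit_migrad",
        algo_options={"stopping_maxfun": 10_000, "n_restarts": 1},
    )
    print(np.round(res.params, 3))  # should be close to the origin
except ImportError:
    print("optimagic/iminuit not installed; sketch not executed")
```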

docs/source/refs.bib

Lines changed: 13 additions & 0 deletions
@@ -893,4 +893,17 @@ @book{Conn2009
   URL = {https://epubs.siam.org/doi/abs/10.1137/1.9780898718768},
 }
 
+@article{JAMES1975343,
+  title = {Minuit - a system for function minimization and analysis of the parameter errors and correlations},
+  journal = {Computer Physics Communications},
+  volume = {10},
+  number = {6},
+  pages = {343-367},
+  year = {1975},
+  issn = {0010-4655},
+  doi = {https://doi.org/10.1016/0010-4655(75)90039-9},
+  url = {https://www.sciencedirect.com/science/article/pii/0010465575900399},
+  author = {F. James and M. Roos}
+}
+
 @Comment{jabref-meta: databaseType:bibtex;}

environment.yml

Lines changed: 1 addition & 0 deletions
@@ -37,6 +37,7 @@ dependencies:
   - jinja2 # dev, tests
   - furo # dev, docs
   - annotated-types # dev, tests
+  - iminuit # dev, tests
   - pip: # dev, tests, docs
       - DFO-LS>=1.5.3 # dev, tests
       - Py-BOBYQA # dev, tests

pyproject.toml

Lines changed: 2 additions & 0 deletions
@@ -16,6 +16,7 @@ dependencies = [
     "sqlalchemy>=1.3",
     "annotated-types",
     "typing-extensions",
+    "iminuit",
 ]
 dynamic = ["version"]
 keywords = [
@@ -378,5 +379,6 @@ module = [
     "optimagic._version",
     "annotated_types",
     "pdbp",
+    "iminuit",
 ]
 ignore_missing_imports = true

src/optimagic/algorithms.py

Lines changed: 17 additions & 0 deletions
@@ -14,6 +14,7 @@
 from optimagic.optimization.algorithm import Algorithm
 from optimagic.optimizers.bhhh import BHHH
 from optimagic.optimizers.fides import Fides
+from optimagic.optimizers.iminuit_migrad import IminuitMigrad
 from optimagic.optimizers.ipopt import Ipopt
 from optimagic.optimizers.nag_optimizers import NagDFOLS, NagPyBOBYQA
 from optimagic.optimizers.neldermead import NelderMeadParallel
@@ -286,6 +287,7 @@ def Scalar(self) -> BoundedGradientBasedLocalNonlinearConstrainedScalarAlgorithm
 @dataclass(frozen=True)
 class BoundedGradientBasedLocalScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -840,6 +842,7 @@ def NonlinearConstrained(
 @dataclass(frozen=True)
 class BoundedGradientBasedLocalAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -889,6 +892,7 @@ def Scalar(self) -> GradientBasedLocalNonlinearConstrainedScalarAlgorithms:
 @dataclass(frozen=True)
 class GradientBasedLocalScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -956,6 +960,7 @@ def Scalar(self) -> BoundedGradientBasedNonlinearConstrainedScalarAlgorithms:
 @dataclass(frozen=True)
 class BoundedGradientBasedScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -1674,6 +1679,7 @@ def Scalar(self) -> BoundedLocalNonlinearConstrainedScalarAlgorithms:
 @dataclass(frozen=True)
 class BoundedLocalScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     nlopt_bobyqa: Type[NloptBOBYQA] = NloptBOBYQA
@@ -1943,6 +1949,7 @@ def Scalar(self) -> GlobalGradientBasedScalarAlgorithms:
 class GradientBasedLocalAlgorithms(AlgoSelection):
     bhhh: Type[BHHH] = BHHH
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -1985,6 +1992,7 @@ def Scalar(self) -> GradientBasedLocalScalarAlgorithms:
 @dataclass(frozen=True)
 class BoundedGradientBasedAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -2054,6 +2062,7 @@ def Scalar(self) -> GradientBasedNonlinearConstrainedScalarAlgorithms:
 @dataclass(frozen=True)
 class GradientBasedScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -2577,6 +2586,7 @@ def Scalar(self) -> GlobalParallelScalarAlgorithms:
 @dataclass(frozen=True)
 class BoundedLocalAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_dfols: Type[NagDFOLS] = NagDFOLS
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
@@ -2659,6 +2669,7 @@ def Scalar(self) -> LocalNonlinearConstrainedScalarAlgorithms:
 @dataclass(frozen=True)
 class LocalScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     neldermead_parallel: Type[NelderMeadParallel] = NelderMeadParallel
@@ -2809,6 +2820,7 @@ def Scalar(self) -> BoundedNonlinearConstrainedScalarAlgorithms:
 @dataclass(frozen=True)
 class BoundedScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     nlopt_bobyqa: Type[NloptBOBYQA] = NloptBOBYQA
@@ -3063,6 +3075,7 @@ def Local(self) -> LeastSquaresLocalParallelAlgorithms:
 class GradientBasedAlgorithms(AlgoSelection):
     bhhh: Type[BHHH] = BHHH
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nlopt_ccsaq: Type[NloptCCSAQ] = NloptCCSAQ
     nlopt_lbfgsb: Type[NloptLBFGSB] = NloptLBFGSB
@@ -3246,6 +3259,7 @@ def Scalar(self) -> GlobalScalarAlgorithms:
 class LocalAlgorithms(AlgoSelection):
     bhhh: Type[BHHH] = BHHH
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_dfols: Type[NagDFOLS] = NagDFOLS
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
@@ -3316,6 +3330,7 @@ def Scalar(self) -> LocalScalarAlgorithms:
 @dataclass(frozen=True)
 class BoundedAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_dfols: Type[NagDFOLS] = NagDFOLS
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
@@ -3451,6 +3466,7 @@ def Scalar(self) -> NonlinearConstrainedScalarAlgorithms:
 @dataclass(frozen=True)
 class ScalarAlgorithms(AlgoSelection):
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
     neldermead_parallel: Type[NelderMeadParallel] = NelderMeadParallel
@@ -3625,6 +3641,7 @@ def Scalar(self) -> ParallelScalarAlgorithms:
 class Algorithms(AlgoSelection):
     bhhh: Type[BHHH] = BHHH
     fides: Type[Fides] = Fides
+    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad
     ipopt: Type[Ipopt] = Ipopt
     nag_dfols: Type[NagDFOLS] = NagDFOLS
     nag_pybobyqa: Type[NagPyBOBYQA] = NagPyBOBYQA
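Editor's note: every hunk above makes the same one-line addition, registering the optimizer class as a field on a frozen `AlgoSelection` dataclass. The following minimal sketch shows just that registration pattern; the `IminuitMigrad` below is a stand-in class, since the real one lives in `optimagic.optimizers.iminuit_migrad`.

```python
# Minimal sketch of the class-registration pattern used throughout
# src/optimagic/algorithms.py. IminuitMigrad is a stand-in here.
from dataclasses import dataclass
from typing import Type

class IminuitMigrad:
    """Stand-in for optimagic.optimizers.iminuit_migrad.IminuitMigrad."""

@dataclass(frozen=True)
class BoundedGradientBasedLocalScalarAlgorithms:
    # each algorithm is exposed as an attribute whose default is the class itself
    iminuit_migrad: Type[IminuitMigrad] = IminuitMigrad

selection = BoundedGradientBasedLocalScalarAlgorithms()
print(selection.iminuit_migrad is IminuitMigrad)  # the field holds the class, not an instance
```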

src/optimagic/config.py

Lines changed: 8 additions & 0 deletions
@@ -92,6 +92,14 @@
 IS_NUMBA_INSTALLED = True
 
 
+try:
+    import iminuit  # noqa: F401
+except ImportError:
+    IS_IMINUIT_INSTALLED = False
+else:
+    IS_IMINUIT_INSTALLED = True
+
+
 # ======================================================================================
 # Check if pandas version is newer or equal to version 2.1.0
 # ======================================================================================
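Editor's note: the hunk above follows a common optional-dependency pattern, try the import and record the outcome in a module-level flag. Here is a self-contained sketch of the same pattern, with a stdlib module standing in for iminuit so the sketch runs anywhere.

```python
# Same try/except/else flag pattern as in src/optimagic/config.py,
# with a stdlib module standing in for the optional dependency.
try:
    import json  # stand-in for "import iminuit"  # noqa: F401
except ImportError:
    IS_DEP_INSTALLED = False
else:
    IS_DEP_INSTALLED = True

# downstream code can now gate features on the flag
print(IS_DEP_INSTALLED)
```

The `else` branch runs only when the import succeeds, so the flag is set exactly once on either path.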

src/optimagic/optimization/algo_options.py

Lines changed: 13 additions & 0 deletions
@@ -122,6 +122,19 @@
 
 """
 
+N_RESTARTS = 1
+"""int: Number of times to restart the optimizer if convergence is not reached.
+
+This parameter controls how many times the optimization process is restarted
+in an attempt to achieve convergence.
+
+- A value of 1 (the default) indicates that the optimizer will only run once,
+  disabling the restart feature.
+- Values greater than 1 specify the maximum number of restart attempts.
+
+Note: This is distinct from `STOPPING_MAXITER`, which limits the number of
+iterations within a single optimizer run, not the number of restarts.
+"""
+
 
 def get_population_size(population_size, x, lower_bound=10):
     """Default population size for genetic algorithms."""
