
Commit d3ddfed

update docs for new scipy optimization backend
1 parent: 2e7571e

7 files changed: +397, -5 lines changed


docs/source/_snippets/user_guide/optimizers.py

Lines changed: 111 additions & 0 deletions
@@ -225,6 +225,117 @@ def objective(params):
# [end:optuna_tpe]


# ============================================================================
# Scipy Backend
# ============================================================================

# [start:scipy_imports]
from hyperactive.opt.scipy import (
    ScipyDifferentialEvolution,  # Global: population-based
    ScipyDualAnnealing,  # Global: simulated annealing variant
    ScipyBasinhopping,  # Global: random perturbations + local search
    ScipySHGO,  # Global: finds multiple local minima
    ScipyDirect,  # Global: deterministic DIRECT algorithm
    ScipyNelderMead,  # Local: simplex-based
    ScipyPowell,  # Local: conjugate direction method
)
# [end:scipy_imports]


# Scipy uses continuous search spaces (tuples instead of arrays)
scipy_search_space = {
    "x": (-5.0, 5.0),
    "y": (-5.0, 5.0),
}
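
# For contrast: GFO-style spaces enumerate discrete values as arrays
# (hypothetical illustration, not part of this snippet file) -- the scipy
# optimizers instead expect (low, high) bounds tuples as above:
#
#     import numpy as np
#     gfo_search_space = {"x": np.arange(-5.0, 5.0, 0.1)}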


# [start:scipy_differential_evolution]
from hyperactive.opt.scipy import ScipyDifferentialEvolution

optimizer = ScipyDifferentialEvolution(
    param_space=scipy_search_space,
    n_iter=100,
    experiment=objective,
    strategy="best1bin",
    random_state=42,
)
# [end:scipy_differential_evolution]


# [start:scipy_dual_annealing]
from hyperactive.opt.scipy import ScipyDualAnnealing

optimizer = ScipyDualAnnealing(
    param_space=scipy_search_space,
    n_iter=100,
    experiment=objective,
    random_state=42,
)
# [end:scipy_dual_annealing]


# [start:scipy_basinhopping]
from hyperactive.opt.scipy import ScipyBasinhopping

optimizer = ScipyBasinhopping(
    param_space=scipy_search_space,
    n_iter=50,
    experiment=objective,
    minimizer_method="Nelder-Mead",
    random_state=42,
)
# [end:scipy_basinhopping]


# [start:scipy_shgo]
from hyperactive.opt.scipy import ScipySHGO

optimizer = ScipySHGO(
    param_space=scipy_search_space,
    n_iter=3,
    experiment=objective,
    n=50,
    sampling_method="simplicial",
)
# [end:scipy_shgo]


# [start:scipy_direct]
from hyperactive.opt.scipy import ScipyDirect

optimizer = ScipyDirect(
    param_space=scipy_search_space,
    n_iter=200,
    experiment=objective,
    locally_biased=True,
)
# [end:scipy_direct]


# [start:scipy_nelder_mead]
from hyperactive.opt.scipy import ScipyNelderMead

optimizer = ScipyNelderMead(
    param_space=scipy_search_space,
    n_iter=200,
    experiment=objective,
    random_state=42,
)
# [end:scipy_nelder_mead]


# [start:scipy_powell]
from hyperactive.opt.scipy import ScipyPowell

optimizer = ScipyPowell(
    param_space=scipy_search_space,
    n_iter=200,
    experiment=objective,
    random_state=42,
)
# [end:scipy_powell]
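
# Minimal usage sketch (kept as a comment so this snippet file stays
# side-effect free): each optimizer above is run the same way via
# solve(), which returns the best parameters found.
#
#     best_params = optimizer.solve()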

# ============================================================================
# Configuration Examples
# ============================================================================

docs/source/api_reference/optimizers/index.rst

Lines changed: 4 additions & 1 deletion
@@ -8,7 +8,7 @@ The :mod:`hyperactive.opt` module contains optimization algorithms for hyperpara
  All optimizers inherit from :class:`~hyperactive.base.BaseOptimizer` and share the same interface:
  the ``solve()`` method to run optimization, and configuration via the ``experiment`` and ``search_space`` parameters.

- Hyperactive provides optimizers from three backends:
+ Hyperactive provides optimizers from four backends:

  .. list-table::
     :widths: 25 75
@@ -20,6 +20,8 @@ Hyperactive provides optimizers from three backends:
       - Native gradient-free optimization algorithms (21 optimizers)
     * - :doc:`optuna`
       - Interface to Optuna's samplers (8 optimizers)
+    * - :doc:`scipy`
+      - Scipy.optimize algorithms for continuous spaces (7 optimizers)
     * - :doc:`sklearn`
       - sklearn-compatible search interfaces (2 optimizers)
@@ -28,4 +30,5 @@ Hyperactive provides optimizers from three backends:
     gfo
     optuna
+    scipy
     sklearn
docs/source/api_reference/optimizers/scipy.rst

Lines changed: 37 additions & 0 deletions
@@ -0,0 +1,37 @@
.. _optimizers_scipy_ref:

Scipy
=====

.. currentmodule:: hyperactive.opt

The Scipy backend provides an interface to `scipy.optimize <https://docs.scipy.org/doc/scipy/reference/optimize.html>`_
algorithms for continuous parameter optimization.

.. note::

   Scipy optimizers only support **continuous parameter spaces** (tuples).
   For discrete or categorical parameters, use the GFO or Optuna backends.
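
A minimal sketch of the distinction (the parameter names here are
illustrative, not part of the API):

.. code-block:: python

   # Continuous bounds as tuples -- supported by the Scipy backend
   continuous_space = {"x": (-5.0, 5.0)}

   # Discrete or categorical values -- use the GFO or Optuna backends
   categorical_space = {"kernel": ["linear", "rbf"]}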

Global Optimizers
-----------------

.. autosummary::
   :toctree: ../auto_generated/
   :template: class.rst

   ScipyDifferentialEvolution
   ScipyDualAnnealing
   ScipyBasinhopping
   ScipySHGO
   ScipyDirect

Local Optimizers
----------------

.. autosummary::
   :toctree: ../auto_generated/
   :template: class.rst

   ScipyNelderMead
   ScipyPowell

docs/source/examples.rst

Lines changed: 5 additions & 0 deletions
@@ -18,6 +18,7 @@ on GitHub.
     examples/population_based
     examples/sequential_model_based
     examples/optuna_backend
+    examples/scipy_backend
     examples/sklearn_backend
     examples/integrations
     examples/other
@@ -61,6 +62,10 @@ Backend Examples
      Examples using Optuna's samplers including TPE, CMA-ES, NSGA-II/III,
      and Gaussian Process optimization.

+ :ref:`examples_scipy_backend`
+     Examples using scipy.optimize algorithms including Differential Evolution,
+     Dual Annealing, Basin-hopping, SHGO, DIRECT, Nelder-Mead, and Powell.
+
  :ref:`examples_sklearn_backend`
      Scikit-learn compatible interfaces as drop-in replacements for
      GridSearchCV and RandomizedSearchCV.
docs/source/examples/scipy_backend.rst

Lines changed: 104 additions & 0 deletions
@@ -0,0 +1,104 @@
.. _examples_scipy_backend:

=============
Scipy Backend
=============

Hyperactive provides wrappers for scipy.optimize algorithms, enabling
well-tested, production-grade optimization for continuous parameter spaces.

.. note::

   Scipy must be installed separately:

   .. code-block:: bash

      pip install scipy
      # or
      pip install hyperactive[all_extras]


Available Optimizers
--------------------

The Scipy backend provides 7 optimizers, divided into global and local methods.

**Global Optimizers** (5 algorithms):

.. list-table::
   :header-rows: 1
   :widths: 30 70

   * - Optimizer
     - Description
   * - ``ScipyDifferentialEvolution``
     - Population-based global optimizer. Robust for multi-modal landscapes.
   * - ``ScipyDualAnnealing``
     - Combines classical simulated annealing with local search.
   * - ``ScipyBasinhopping``
     - Random perturbations with local minimization. Good for finding global minima.
   * - ``ScipySHGO``
     - Simplicial Homology Global Optimization. Finds multiple local minima.
   * - ``ScipyDirect``
     - Deterministic DIRECT algorithm. No random seed required.

**Local Optimizers** (2 algorithms):

.. list-table::
   :header-rows: 1
   :widths: 30 70

   * - Optimizer
     - Description
   * - ``ScipyNelderMead``
     - Simplex-based optimizer. Fast for smooth functions.
   * - ``ScipyPowell``
     - Conjugate direction method. Often faster than Nelder-Mead.


Quick Example
-------------

Scipy optimizers require continuous parameter spaces defined as tuples:

.. code-block:: python

   from hyperactive.opt.scipy import ScipyDifferentialEvolution

   # Define a continuous search space (tuples, not arrays)
   param_space = {
       "x": (-5.0, 5.0),
       "y": (-5.0, 5.0),
   }

   def objective(params):
       x, y = params["x"], params["y"]
       return -(x**2 + y**2)  # Maximize (minimize the negative)

   optimizer = ScipyDifferentialEvolution(
       param_space=param_space,
       n_iter=100,
       experiment=objective,
       random_state=42,
   )

   best_params = optimizer.solve()
   print(f"Best parameters: {best_params}")
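
All seven wrappers share the same constructor arguments (``param_space``,
``n_iter``, ``experiment``), so switching algorithms is a one-line change.
A minimal sketch reusing the ``param_space`` and ``objective`` defined
above, with defaults assumed for all other options:

.. code-block:: python

   from hyperactive.opt.scipy import ScipyDualAnnealing

   # Same search space and objective; only the algorithm changes
   optimizer = ScipyDualAnnealing(
       param_space=param_space,
       n_iter=100,
       experiment=objective,
       random_state=42,
   )
   best_params = optimizer.solve()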
87+
88+
89+
When to Use Scipy Backend
90+
-------------------------
91+
92+
The Scipy backend is useful when:
93+
94+
- **Continuous parameters only**: Your search space has no categorical or discrete values
95+
- **Production-grade algorithms**: You need well-tested, reliable implementations
96+
- **Specific scipy features**: You want scipy's differential evolution or simulated annealing
97+
- **Deterministic optimization**: Use ``ScipyDirect`` for reproducible results without random seeds
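
A minimal sketch of the deterministic case, reusing ``param_space`` and
``objective`` from the quick example (``n_iter=200`` mirrors the snippet
file; defaults are assumed otherwise):

.. code-block:: python

   from hyperactive.opt.scipy import ScipyDirect

   # DIRECT is deterministic: repeated runs over the same space and
   # objective give the same result, so no random_state is needed
   optimizer = ScipyDirect(
       param_space=param_space,
       n_iter=200,
       experiment=objective,
   )
   best_params = optimizer.solve()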

See Also
--------

- :ref:`user_guide_optimizers_scipy` - Detailed guide with all optimizer examples
- :ref:`optimizers_scipy_ref` - API reference for all Scipy optimizers

docs/source/user_guide/optimizers/index.rst

Lines changed: 19 additions & 4 deletions
@@ -4,7 +4,7 @@
  Optimizers
  ==========

- Hyperactive provides 31 algorithms across 5 categories and 3 backends.
+ Hyperactive provides 38 algorithms across 5 categories and 4 backends.
  Optimizers navigate the search space to find optimal parameters. Each implements a
  different strategy for balancing exploration (trying diverse regions) and exploitation
  (refining promising solutions). Local search methods like Hill Climbing work well for
@@ -20,10 +20,10 @@ Algorithm Landscape
     <div class="theme-aware-diagram">
       <img src="../../_static/diagrams/optimizer_taxonomy_light.svg"
-           alt="Hyperactive optimizer taxonomy showing 31 algorithms across GFO, Optuna, and sklearn backends"
+           alt="Hyperactive optimizer taxonomy showing 38 algorithms across GFO, Optuna, Scipy, and sklearn backends"
            class="only-light" />
       <img src="../../_static/diagrams/optimizer_taxonomy_dark.svg"
-           alt="Hyperactive optimizer taxonomy showing 31 algorithms across GFO, Optuna, and sklearn backends"
+           alt="Hyperactive optimizer taxonomy showing 38 algorithms across GFO, Optuna, Scipy, and sklearn backends"
            class="only-dark" />
     </div>

@@ -133,6 +133,17 @@ Algorithm Categories
     *TPEOptimizer, CmaEsOptimizer, GPOptimizer, NSGAIIOptimizer, and more*

+ .. grid-item-card:: Scipy Backend
+    :link: scipy
+    :link-type: doc
+    :class-card: sd-border-secondary
+
+    **7 algorithms**
+    ^^^
+    Scipy.optimize algorithms for continuous parameter spaces.
+
+    *DifferentialEvolution, DualAnnealing, Basinhopping, SHGO, Direct, NelderMead, Powell*
+
  ----

  Scenario Reference
@@ -163,8 +174,11 @@ Detailed recommendations based on problem characteristics:
       - ``GridSearch``
       - Exhaustive coverage when feasible
     * - Continuous parameters
-      - ``BayesianOptimizer``, ``CmaEsOptimizer``
+      - ``BayesianOptimizer``, ``CmaEsOptimizer``, ``ScipyDifferentialEvolution``
       - Designed for smooth, continuous spaces
+    * - Continuous only (scipy)
+      - ``ScipyDualAnnealing``, ``ScipyBasinhopping``, ``ScipyNelderMead``
+      - Production-grade scipy.optimize implementations
     * - Mixed parameter types
       - ``TPEOptimizer``, ``RandomSearch``
       - Handle categorical + continuous well
@@ -191,4 +205,5 @@ All optimizers share common parameters and configuration options.
     population_based
     sequential_model_based
     optuna
+    scipy
     configuration
