Commit a38e9bd

add static doc content for scipy optimizers
1 parent 30ab43f

File tree: 3 files changed, +115 −0 lines

docs/source/api_reference/optimizers.rst (+51 −0)

@@ -115,6 +115,57 @@ These optimizers provide an interface to Optuna's optimization algorithms.
     NSGAIIIOptimizer
     QMCOptimizer
 
+Scipy-Based Optimizers
+----------------------
+
+These optimizers provide an interface to ``scipy.optimize`` algorithms.
+They support **continuous parameter spaces only** (tuples). For discrete
+or categorical parameters, use the Optuna or GFO backends.
+
+Global Optimizers
+~~~~~~~~~~~~~~~~~
+
+Algorithms that search the whole parameter space for a global optimum.
+
+.. autosummary::
+    :toctree: auto_generated/
+    :template: class.rst
+
+    ScipyDifferentialEvolution
+    ScipyDualAnnealing
+    ScipyBasinhopping
+    ScipySHGO
+    ScipyDirect
+
+Local Optimizers
+~~~~~~~~~~~~~~~~
+
+Derivative-free local optimization algorithms.
+
+.. autosummary::
+    :toctree: auto_generated/
+    :template: class.rst
+
+    ScipyNelderMead
+    ScipyPowell
+
+SMAC-Based Optimizers
+---------------------
+
+These optimizers provide an interface to SMAC3's Bayesian optimization algorithms.
+SMAC3 (Sequential Model-based Algorithm Configuration) uses Random Forest or
+Gaussian Process surrogate models for efficient hyperparameter optimization.
+
+Install with ``pip install smac`` or ``pip install hyperactive[smac]``.
+
+.. autosummary::
+    :toctree: auto_generated/
+    :template: class.rst
+
+    SmacRandomForest
+    SmacGaussianProcess
+    SmacRandomSearch
+
 Scikit-Learn Style
 -------------------
 
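For context on what the wrappers in the Global and Local lists map to, the sketch
below shows the underlying ``scipy.optimize`` calls they are presumed to delegate
to; the exact delegation is an assumption, not something this commit confirms.

.. code-block:: python

    # Illustrative sketch: the scipy.optimize calls the wrappers above are
    # presumed to delegate to (the exact delegation is an assumption).
    import numpy as np
    from scipy.optimize import differential_evolution, minimize

    def objective(x):
        # toy continuous objective (sphere function)
        return float(np.sum(x ** 2))

    # one (low, high) tuple per continuous parameter
    bounds = [(0.0001, 0.1), (0.5, 0.99)]

    # global search, as in ScipyDifferentialEvolution
    result_global = differential_evolution(
        objective, bounds, strategy="best1bin", seed=42
    )

    # derivative-free local search, as in ScipyNelderMead
    result_local = minimize(objective, x0=[0.05, 0.7], method="Nelder-Mead")

    print(result_global.x, result_local.x)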
docs/source/user_guide/optimizers.rst (+36 −0)

@@ -35,6 +35,9 @@ The best optimizer depends on your problem characteristics:
   * - Small search space
     - ``GridSearch``
     - Exhaustive coverage when feasible
+  * - Continuous params only
+    - ``ScipyDifferentialEvolution``, ``ScipyDualAnnealing``
+    - Well-tested scipy implementations
 
 
 Optimizer Categories
@@ -264,6 +267,39 @@ Example with Optuna TPE:
     :end-before: # [end:optuna_tpe]
 
 
+Scipy Backend
+^^^^^^^^^^^^^
+
+Scipy optimizers are well-tested implementations for **continuous parameter
+spaces only**; they do not support discrete or categorical parameters.
+
+Available scipy optimizers:
+
+- **Global**: ``ScipyDifferentialEvolution``, ``ScipyDualAnnealing``,
+  ``ScipyBasinhopping``, ``ScipySHGO``, ``ScipyDirect``
+- **Local**: ``ScipyNelderMead``, ``ScipyPowell``
+
+Example with Scipy Differential Evolution:
+
+.. code-block:: python
+
+    from hyperactive.opt.scipy import ScipyDifferentialEvolution
+
+    # ``experiment`` is assumed to be defined earlier in the guide
+    optimizer = ScipyDifferentialEvolution(
+        param_space={
+            "learning_rate": (0.0001, 0.1),
+            "momentum": (0.5, 0.99),
+        },
+        n_iter=100,
+        strategy="best1bin",
+        random_state=42,
+        experiment=experiment,
+    )
+    best_params = optimizer.solve()
+
+For discrete or categorical parameters, use the GFO or Optuna backends instead.
+
 
 Optimizer Configuration
 -----------------------
 
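The example above passes an ``experiment`` object that this diff never defines.
Below is a minimal sketch of how one might be constructed; the
``SklearnCvExperiment`` import path and signature are assumptions about
hyperactive's experiment API, not confirmed by this commit.

.. code-block:: python

    # Assumed import path; SklearnCvExperiment and its signature are an
    # assumption, not confirmed by this commit.
    from hyperactive.experiment.integrations import SklearnCvExperiment
    from sklearn.datasets import load_iris
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # scores an SVC via cross-validation; C and gamma are continuous,
    # so they fit scipy's tuple-only parameter spaces
    experiment = SklearnCvExperiment(estimator=SVC(), X=X, y=y)

The resulting ``experiment`` could then be paired with a tuple-only
``param_space`` such as ``{"C": (0.01, 100.0), "gamma": (0.0001, 1.0)}``,
since both SVC parameters are continuous.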
docs/source/user_guide/search_spaces.rst (+28 −0)

@@ -82,6 +82,34 @@ For parameters that vary continuously, use NumPy to create arrays:
 though Hyperactive accepts both formats.
 
 
+Scipy Backend: Tuple Ranges
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Scipy optimizers use a different format: ``(low, high)`` tuples that define
+truly continuous ranges (not discretized grids):
+
+.. code-block:: python
+
+    from hyperactive.opt.scipy import ScipyDifferentialEvolution
+
+    # Scipy uses tuples for continuous bounds
+    param_space = {
+        "learning_rate": (0.0001, 0.1),  # continuous range
+        "momentum": (0.5, 0.99),  # continuous range
+    }
+
+    optimizer = ScipyDifferentialEvolution(
+        param_space=param_space,
+        n_iter=100,
+        experiment=experiment,  # assumed defined earlier in the guide
+    )
+
+.. warning::
+
+    Scipy optimizers do **not** support lists or categorical values.
+    Use the GFO or Optuna backends for discrete/categorical parameters.
+
+
 Scale-Appropriate Spacing
 -------------------------
 
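To make the format contrast concrete, the sketch below places a discretized
GFO/Optuna-style space next to the tuple ranges scipy requires; the specific
parameter names are illustrative only.

.. code-block:: python

    import numpy as np

    # GFO/Optuna-style space: explicit discrete values (lists / arrays)
    discrete_space = {
        "learning_rate": np.logspace(-4, -1, 25),  # 25 discrete grid points
        "activation": ["relu", "tanh"],            # categorical: not valid for scipy
    }

    # scipy-style space: (low, high) tuples, sampled continuously
    continuous_space = {
        "learning_rate": (0.0001, 0.1),
        "momentum": (0.5, 0.99),
    }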