Commit 867dd7a

Merge pull request #511 from sathvikbhagavan/sb/opt
refactor: add ! in surrogate_optimize as it is mutating
2 parents 5d70700 + 38b2cac · commit 867dd7a

22 files changed: +88 additions, -85 deletions
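The change itself is mechanical: `surrogate_optimize` mutates the surrogate it is handed (it adds the newly evaluated sample points to it), so the commit renames it to `surrogate_optimize!` per Julia's bang convention and updates every documentation call site. A minimal before/after sketch of the pattern, using a toy 1D problem in the style of the tutorials below (the function `f` and the sample count are illustrative, not taken from the diff):

```julia
using Surrogates

# Toy 1D objective, illustrative only
f = x -> sin(x) + x^2
lower_bound, upper_bound = -5.0, 5.0

x = sample(20, lower_bound, upper_bound, SobolSample())
y = f.(x)
my_surrogate = RadialBasis(x, y, lower_bound, upper_bound)

# Before this commit:
#   surrogate_optimize(f, SRBF(), lower_bound, upper_bound, my_surrogate, SobolSample())
# After: same call, now spelled with `!` to signal that it mutates
# `my_surrogate` in place by appending the new sample points:
surrogate_optimize!(f, SRBF(), lower_bound, upper_bound, my_surrogate, SobolSample())
```

Only the name changes; arguments and keyword options (`maxiters`, `num_new_samples`, etc.) are untouched, as the diffs below show.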

docs/src/InverseDistance.md

Lines changed: 3 additions & 3 deletions
@@ -51,10 +51,10 @@ plot!(InverseDistance, label = "Surrogate function",
 
 Having built a surrogate, we can now use it to search for minima in our original function `f`.
 
-To optimize using our surrogate we call `surrogate_optimize` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
+To optimize using our surrogate we call `surrogate_optimize!` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
 
 ```@example Inverse_Distance1D
-surrogate_optimize(
+surrogate_optimize!(
     f, SRBF(), lower_bound, upper_bound, InverseDistance, SobolSample())
 scatter(x, y, label = "Sampled points", legend = :top)
 plot!(f, label = "True function", xlims = (lower_bound, upper_bound), legend = :top)
@@ -132,7 +132,7 @@ size(xys)
 ```
 
 ```@example Inverse_DistanceND
-surrogate_optimize(schaffer, SRBF(), lower_bound, upper_bound,
+surrogate_optimize!(schaffer, SRBF(), lower_bound, upper_bound,
     InverseDistance, SobolSample(), maxiters = 10)
 ```

docs/src/LinearSurrogate.md

Lines changed: 3 additions & 3 deletions
@@ -50,10 +50,10 @@ plot!(my_linear_surr_1D, label = "Surrogate function", xlims = (lower_bound, upp
 
 Having built a surrogate, we can now use it to search for minima in our original function `f`.
 
-To optimize using our surrogate we call `surrogate_optimize` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
+To optimize using our surrogate we call `surrogate_optimize!` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
 
 ```@example linear_surrogate1D
-surrogate_optimize(
+surrogate_optimize!(
     f, SRBF(), lower_bound, upper_bound, my_linear_surr_1D, SobolSample())
 scatter(x, y, label = "Sampled points")
 plot!(f, label = "True function", xlims = (lower_bound, upper_bound))
@@ -130,7 +130,7 @@ size(xys)
 ```
 
 ```@example linear_surrogateND
-surrogate_optimize(
+surrogate_optimize!(
     egg, SRBF(), lower_bound, upper_bound, my_linear_ND, SobolSample(), maxiters = 10)
 ```

docs/src/abstractgps.md

Lines changed: 1 addition & 1 deletion
@@ -56,7 +56,7 @@ xs = lower_bound:0.1:upper_bound
 x = sample(n_samples, lower_bound, upper_bound, SobolSample())
 y = f.(x)
 gp_surrogate = AbstractGPSurrogate(x, y)
-surrogate_optimize(f, SRBF(), lower_bound, upper_bound, gp_surrogate, SobolSample())
+surrogate_optimize!(f, SRBF(), lower_bound, upper_bound, gp_surrogate, SobolSample())
 ```
 
 Plotting the function and the sampled points:

docs/src/ackley.md

Lines changed: 1 addition & 1 deletion
@@ -55,7 +55,7 @@ The fit looks good. Let's now see if we are able to find the minimum value using
 optimization methods:
 
 ```@example ackley
-surrogate_optimize(ackley, DYCORS(), lb, ub, my_rad, RandomSample())
+surrogate_optimize!(ackley, DYCORS(), lb, ub, my_rad, RandomSample())
 scatter(x, y, label = "Sampled points", xlims = (lb, ub), ylims = (0, 30), legend = :top)
 plot!(xs, ackley.(xs), label = "True function", legend = :top)
 plot!(xs, my_rad.(xs), label = "Radial basis optimized", legend = :top)

docs/src/gekpls.md

Lines changed: 1 addition & 1 deletion
@@ -78,7 +78,7 @@ x = sample(n, lb, ub, SobolSample())
 grads = gradient.(sphere_function, x)
 y = sphere_function.(x)
 g = GEKPLS(x, y, grads, n_comp, delta_x, lb, ub, extra_points, initial_theta)
-x_point, minima = surrogate_optimize(sphere_function, SRBF(), lb, ub, g,
+x_point, minima = surrogate_optimize!(sphere_function, SRBF(), lb, ub, g,
     RandomSample(); maxiters = 20,
     num_new_samples = 20, needs_gradient = true)
 minima

docs/src/index.md

Lines changed: 1 addition & 1 deletion
@@ -114,7 +114,7 @@ my_lobachevsky = LobachevskySurrogate(x, y, lb, ub, alpha = alpha, n = n)
 value = my_lobachevsky(5.0)
 
 #Adding more data points
-surrogate_optimize(f, SRBF(), lb, ub, my_lobachevsky, RandomSample())
+surrogate_optimize!(f, SRBF(), lb, ub, my_lobachevsky, RandomSample())
 
 #New approximation
 value = my_lobachevsky(5.0)

docs/src/kriging.md

Lines changed: 3 additions & 3 deletions
@@ -52,10 +52,10 @@ plot!(xs, kriging_surrogate.(xs), label = "Surrogate function",
 
 Having built a surrogate, we can now use it to search for minima in our original function `f`.
 
-To optimize using our surrogate, we call `surrogate_optimize` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
+To optimize using our surrogate, we call `surrogate_optimize!` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
 
 ```@example kriging_tutorial1d
-surrogate_optimize(
+surrogate_optimize!(
     f, SRBF(), lower_bound, upper_bound, kriging_surrogate, SobolSample())
 
 scatter(x, y, label = "Sampled points", ylims = (-7, 7), legend = :top)
@@ -139,7 +139,7 @@ size(xys)
 ```
 
 ```@example kriging_tutorialnd
-surrogate_optimize(branin, SRBF(), lower_bound, upper_bound, kriging_surrogate,
+surrogate_optimize!(branin, SRBF(), lower_bound, upper_bound, kriging_surrogate,
     SobolSample(); maxiters = 100, num_new_samples = 10)
 ```

docs/src/lobachevsky.md

Lines changed: 3 additions & 3 deletions
@@ -48,10 +48,10 @@ plot!(
 
 Having built a surrogate, we can now use it to search for minima in our original function `f`.
 
-To optimize using our surrogate we call `surrogate_optimize` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
+To optimize using our surrogate we call `surrogate_optimize!` method. We choose to use Stochastic RBF as the optimization technique and again Sobol sampling as the sampling technique.
 
 ```@example LobachevskySurrogate_tutorial
-surrogate_optimize(
+surrogate_optimize!(
     f, SRBF(), lower_bound, upper_bound, lobachevsky_surrogate, SobolSample())
 scatter(x, y, label = "Sampled points")
 plot!(f, label = "True function", xlims = (lower_bound, upper_bound))
@@ -132,7 +132,7 @@ size(Lobachevsky.x)
 ```
 
 ```@example LobachevskySurrogate_ND
-surrogate_optimize(schaffer, SRBF(), lower_bound, upper_bound, Lobachevsky,
+surrogate_optimize!(schaffer, SRBF(), lower_bound, upper_bound, Lobachevsky,
    SobolSample(), maxiters = 1, num_new_samples = 10)
 ```

docs/src/multi_objective_opt.md

Lines changed: 4 additions & 4 deletions
@@ -12,7 +12,7 @@ ub = 10.0
 x = sample(50, lb, ub, GoldenSample())
 y = f.(x)
 my_radial_basis_ego = RadialBasis(x, y, lb, ub)
-pareto_set, pareto_front = surrogate_optimize(
+pareto_set, pareto_front = surrogate_optimize!(
     f, SMB(), lb, ub, my_radial_basis_ego, SobolSample(); maxiters = 10, n_new_look = 100)
 
 m = 5
@@ -27,7 +27,7 @@ K = 2
 p_cross = 0.5
 n_c = 1.0
 sigma = 1.5
-surrogate_optimize(
+surrogate_optimize!(
     f, RTEA(Z, K, p_cross, n_c, sigma), lb, ub, my_radial_basis_rtea, SobolSample())
 ```
 
@@ -43,7 +43,7 @@ ub = [3.5, 0.5]
 x = sample(50, lb, ub, SobolSample())
 y = f.(x)
 my_radial_basis_ego = RadialBasis(x, y, lb, ub)
-#I can find my pareto set and pareto front by calling again the surrogate_optimize function:
-pareto_set, pareto_front = surrogate_optimize(
+#I can find my pareto set and pareto front by calling again the surrogate_optimize! function:
+pareto_set, pareto_front = surrogate_optimize!(
     f, SMB(), lb, ub, my_radial_basis_ego, SobolSample(); maxiters = 10, n_new_look = 100);
 ```

docs/src/neural.md

Lines changed: 1 addition & 1 deletion
@@ -77,6 +77,6 @@ plot(p1, p2, title = "Surrogate")
 We can now call an optimization function on the neural network:
 
 ```@example Neural_surrogate
-surrogate_optimize(schaffer, SRBF(), lower_bound, upper_bound, neural,
+surrogate_optimize!(schaffer, SRBF(), lower_bound, upper_bound, neural,
     SobolSample(), maxiters = 20, num_new_samples = 10)
 ```
