Let's start with something easy to get our hands dirty.
I want to build a surrogate for `` f(x) = \log(x) \cdot x^2 + x^3 ``.
Let's choose the radial basis surrogate.

``` @example
using Surrogates
f = x -> log(x)*x^2+x^3
# Reconstructed setup: bounds, sampling, and fit (the sample count is an
# assumption; the RadialBasis call is from the original)
n_samples = 50
lb = 1.0
ub = 10.0
x = sample(n_samples,lb,ub,SobolSample())
y = f.(x)
my_radial_basis = RadialBasis(x,y,lb,ub)

# I want an approximation at 5.4
approx = my_radial_basis(5.4)
```

Let's now see an example in 2D.

``` @example
using Surrogates
using LinearAlgebra
# Reconstructed middle of the example: the objective (norm is why
# LinearAlgebra is loaded), bounds, and sample count are assumptions
f = x -> norm(x)
lb = [1.0,2.0]
ub = [10.0,8.5]
x = sample(50,lb,ub,SobolSample())
y = f.(x)
my_radial_basis = RadialBasis(x,y,lb,ub)

# Approximation at a new 2D point
approx = my_radial_basis((2.0,3.0))
```

Let's now use the Kriging surrogate, which is a single-output Gaussian process.
This surrogate has a nice feature: not only does it approximate the solution at a
point, it also calculates the standard error at that point.
Let's see an example:

``` @example kriging
using Surrogates
f = x -> exp(x)*x^2+x^3
# Reconstructed setup (bounds and sample count are assumptions); the
# std_error_at_point call is from the original
lb = 0.0
ub = 10.0
x = sample(50,lb,ub,UniformSample())
y = f.(x)
my_krig = Kriging(x,y,lb,ub)

# I want an approximation at 5.4
approx = my_krig(5.4)

# I want to find the standard error at 5.4
std_err = std_error_at_point(my_krig,5.4)
```

Let's now optimize the Kriging surrogate using the lower confidence bound method; this is just a one-liner:

``` @example kriging
surrogate_optimize(f,LCBS(),lb,ub,my_krig,UniformSample())
```

Surrogate optimization methods serve two purposes at once: they sample the space in unexplored regions and search for the minima at the same time.
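
After the run, the points probed by `surrogate_optimize` are stored in the surrogate itself, so the located minimum can be read off from the sampled data. A minimal sketch, assuming the Kriging struct exposes its samples through `x` and `y` fields:

```julia
# Hypothetical follow-up: pick the best observed sample out of the
# surrogate's stored data (the `x` and `y` field names are assumptions).
best_idx = argmin(my_krig.y)
minimum_point = my_krig.x[best_idx]
minimum_value = my_krig.y[best_idx]
```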

## Lobachevsky integral
The Lobachevsky surrogate has the nice feature of a closed formula for its
integral, which is something that other surrogates are missing.
Let's compare it with QuadGK:

``` @example
using Surrogates
using QuadGK
obj = x -> 3*x + log(x)
# Reconstructed setup (sample count and the alpha, n hyperparameters are
# assumptions); the int_val_true line is from the original
a = 1.0
b = 4.0
x = sample(100,a,b,SobolSample())
y = obj.(x)
alpha = 2.0
n = 6
my_loba = LobachevskySurrogate(x,y,a,b,alpha=alpha,n=n)

# Closed-form integral of the surrogate versus adaptive quadrature
int_1D = lobachevsky_integral(my_loba,a,b)
int = quadgk(obj,a,b)
int_val_true = int[1]-int[2]
```

## Example of NeuralSurrogate
Basic example of fitting a neural network on a simple function of two variables.

``` @example
using Surrogates
using Flux
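# Assumed completion: the objective, bounds, sample count, model, and
# training settings below are illustrative, not the documented code.
f = x -> x[1]^2 + x[2]^2
lb = [0.0,0.0]
ub = [5.0,5.0]
x = sample(30,lb,ub,SobolSample())
y = f.(x)
model = Chain(Dense(2,1))
loss(x,y) = Flux.mse(model(x),y)
my_neural = NeuralSurrogate(x,y,lb,ub,model=model,loss=loss,opt=Descent(0.01),n_echos=1)
approx = my_neural((2.0,3.5))
```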