Commit

Merge pull request #336 from vikram-s-narayan/update-abstractgps-docs
add note about need for hyperoptimization in abstractgps surrogate
ChrisRackauckas authored May 1, 2022
2 parents 18ca647 + 323334f commit 8d4ac35
Showing 1 changed file with 4 additions and 1 deletion.
docs/src/abstractgps.md (5 changes: 4 additions & 1 deletion)
@@ -2,7 +2,10 @@

Gaussian Process regression in Surrogates.jl is implemented as a simple wrapper around the [AbstractGPs.jl](https://github.com/JuliaGaussianProcesses/AbstractGPs.jl) package. AbstractGPs comes with a variety of covariance functions (kernels). See [KernelFunctions.jl](https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/) for examples.
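
For instance, kernels from KernelFunctions.jl can be constructed and composed directly. A minimal sketch (the specific kernels and the lengthscale value below are illustrative choices, not defaults of the surrogate):

```julia
using KernelFunctions

# A few covariance functions from KernelFunctions.jl
k_rbf = SqExponentialKernel()                             # squared-exponential (RBF) kernel
k_mat = Matern52Kernel()                                  # Matern 5/2 kernel
k_scaled = with_lengthscale(SqExponentialKernel(), 0.3)   # RBF with an explicit lengthscale

# Kernels compose: sums and scalar multiples of kernels are kernels too
k_combined = k_rbf + 0.5 * k_mat
```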


!!! note
    The examples below demonstrate the use of AbstractGPs with out-of-the-box settings, without hyperparameter optimization (i.e. without tuning parameters such as the lengthscale, signal variance, and noise variance). Beyond hyperparameter optimization, careful initialization of the hyperparameters and priors on the parameters is required for this surrogate to work properly. For more details on how to fit GPs in practice, check out [A Practical Guide to Gaussian Processes](https://infallible-thompson-49de36.netlify.app/).

Also see this [example](https://juliagaussianprocesses.github.io/AbstractGPs.jl/stable/examples/1-mauna-loa/#Hyperparameter-Optimization) to understand hyperparameter optimization with AbstractGPs.
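
As a rough illustration of what hyperparameter optimization involves (not the approach from the linked example, which optimizes all parameters with a gradient-based method), one can score candidate lengthscales by the log marginal likelihood of the data under the GP prior. A minimal sketch with made-up toy data:

```julia
using AbstractGPs, KernelFunctions

# Toy data (illustrative only)
x = collect(range(0.0, 1.0; length = 20))
y = sin.(6 .* x) .+ 0.05 .* randn(20)
σ² = 0.05  # assumed observation-noise variance

# Evaluate the log marginal likelihood over a grid of candidate lengthscales
lengthscales = 10 .^ range(-2, 1; length = 50)
lml = map(lengthscales) do ℓ
    k = with_lengthscale(SqExponentialKernel(), ℓ)
    logpdf(GP(k)(x, σ²), y)   # log marginal likelihood of y under this kernel
end

best_lengthscale = lengthscales[argmax(lml)]
```

In practice, one would optimize all kernel and noise parameters jointly (e.g. with Optim.jl, as in the linked Mauna Loa example) rather than grid-searching a single parameter.
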
## 1D Example
In the example below, the `gp_surrogate` assignment can be commented or uncommented to see how different kernels influence the predictions.
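
A minimal sketch of what that assignment looks like (this assumes the `AbstractGPSurrogate(x, y; gp, Σy)` constructor and the usual `sample`/`SobolSample` helpers from Surrogates.jl; the test function is an illustrative choice):

```julia
using Surrogates, AbstractGPs

f(x) = (6x - 2)^2 * sin(12x - 4)   # illustrative 1D test function

lb, ub = 0.0, 1.0
x = sample(10, lb, ub, SobolSample())
y = f.(x)

# Out-of-the-box settings (default kernel and noise variance)
gp_surrogate = AbstractGPSurrogate(x, y)

# Swap in a different kernel by commenting/uncommenting, e.g.:
# gp_surrogate = AbstractGPSurrogate(x, y; gp = GP(SqExponentialKernel()), Σy = 0.05)

gp_surrogate(0.5)   # evaluate the surrogate at a new point
```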
