Making Kriging differentiable and better initial hyperparams #368
Comments
Some further improvement ideas:
Oh nice, can you open a PR? Note a similar discussion is occurring in #366
That sounds like a good idea. A smooth squared-exponential kernel is more expected as a default, too.
Made a PR for the differentiability. For the initial hyperparams, I propose: [proposed defaults elided]. The latter comes from *A Practical Guide to Gaussian Processes* and the interpretation of the hyperparameters.

[Comparison plots: Sphere function, L1-norm function, Branin function, Rosenbrock]

Clearly, the new hyperparameters are a significant improvement. I'll make a PR with this, but we'd also want to update all of the documentation examples using Kriging to use the new default hyperparameters, as the current examples don't exactly give the most confidence in the method's performance.
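One plausible shape for such defaults (a hypothetical sketch only — the exact values proposed in the PR are elided above; the `0.5 / std(x)^p` scaling is an assumption, while `p = 2.0` follows the squared-exponential suggestion in this thread):

```julia
using Statistics

# Hypothetical heuristic for initial Kriging hyperparameters:
# p = 2 gives a smooth squared-exponential kernel, and theta is
# tied to the spread of the sample points so the length scale
# adapts to the data rather than being a fixed constant.
initial_hyperparams(x) = (p = 2.0, theta = 0.5 / std(x)^2)
```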
Fixed by #374
In order to make hyperparameter optimization of Gaussian process models simpler (cf. #328), it would be nice to be able to use AD on Kriging surrogates. This turns out not to be too difficult: a single change to the computation of the Kriging coefficients makes these surrogates differentiable with respect to their hyperparameters.
Setup:
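The setup code was lost in extraction; a minimal sketch of what it might have looked like (the test function, sample count, and bounds are illustrative choices, not taken from the issue):

```julia
using Surrogates

# Illustrative 1-D problem
f(x) = sin(2pi * x)
lb, ub = 0.0, 1.0

# Sample points and build a Kriging surrogate with default hyperparameters
x = sample(10, lb, ub, SobolSample())
y = f.(x)
krig = Kriging(x, y, lb, ub)
```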
Just to check that this works, here's how the model performs
Now let's try taking a gradient with respect to our hyperparameters $p, \theta$:
We get an error:

```
ERROR: Mutating arrays is not supported -- called setindex!(Matrix{Float64}, ...)
```

which comes from the `_calc_kriging_coeffs` function when we build the covariance matrix `R`. We can replace the mutating part of this function with a matrix comprehension as follows:

old:
new:
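The old/new snippets were lost in extraction; a self-contained sketch of the pattern they presumably contrast (the kernel and function names here are illustrative, not the exact Surrogates.jl code):

```julia
# old: build R by mutating a preallocated matrix — the setindex! calls
# are exactly what Zygote's reverse-mode AD cannot handle
function build_R_mutating(x, p, theta)
    n = length(x)
    R = zeros(n, n)
    for i in 1:n, j in 1:n
        R[i, j] = exp(-theta * abs(x[i] - x[j])^p)
    end
    return R
end

# new: the same matrix as a comprehension, with no mutation
build_R(x, p, theta) =
    [exp(-theta * abs(xi - xj)^p) for xi in x, xj in x]
```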
With that, we can compute a gradient!
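A self-contained sketch of the resulting gradient computation (the data and the scalar objective are toy choices for illustration; the `i == j` guard is an added detail that keeps the diagonal away from `0^p`, whose derivative with respect to `p` is not finite):

```julia
using LinearAlgebra, Zygote

# Toy 1-D data (illustrative, not from the issue)
x = [0.0, 0.3, 0.7, 1.0]
y = sin.(2pi .* x)

# Non-mutating covariance build via a matrix comprehension
R(p, theta) = [i == j ? 1.0 : exp(-theta * abs(x[i] - x[j])^p)
               for i in eachindex(x), j in eachindex(x)]

# A scalar objective in the hyperparameters: the quadratic form
# y' R⁻¹ y that appears in the Kriging log-likelihood
loss(p, theta) = dot(y, R(p, theta) \ y)

# Gradient with respect to (p, theta) — no mutation, so Zygote succeeds
g = Zygote.gradient(loss, 2.0, 1.0)
```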