Hello everyone,

I was just curious whether there is a built-in option for parameter optimization (e.g., for theta and p in the Kriging model), or whether this is usually done with an external library?
Best regards,
Jan
That's not quite the full story. Hyperparameter optimization is traditionally done with a derivative-free Bayesian method because many libraries are not differentiable, so standard derivative-based optimization techniques cannot be used. We have regressed a bit (https://github.com/SciML/Surrogates.jl/blob/master/test/runtests.jl#L26), but there was a time when the library was fully differentiable. We haven't written the paper on Surrogates.jl yet, but the plan is for it to be a fully differentiable surrogate library that allows derivative-based hyperparameter optimization (and I'll be inviting all contributors to be authors, BTW). We're not quite there yet, but that's the goal.
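For reference, here is a minimal, self-contained sketch of the derivative-free route mentioned above: fitting theta and p of a 1-D Kriging correlation `R_ij = exp(-theta * |x_i - x_j|^p)` by maximizing the concentrated log-likelihood with a plain grid search. This does not use Surrogates.jl internals; the function names, grid ranges, and the nugget value are all hypothetical choices for illustration.

```julia
using LinearAlgebra

# Concentrated log-likelihood of a simple constant-mean Kriging model
# with correlation R_ij = exp(-theta * |x_i - x_j|^p) and a small nugget.
function log_likelihood(x, y, theta, p)
    n = length(x)
    R = [exp(-theta * abs(x[i] - x[j])^p) for i in 1:n, j in 1:n] + 1e-10I
    F = cholesky(Symmetric(R))
    one_vec = ones(n)
    mu = (one_vec' * (F \ y)) / (one_vec' * (F \ one_vec))  # GLS estimate of the mean
    r = y .- mu
    sigma2 = (r' * (F \ r)) / n                              # process variance estimate
    return -0.5 * (n * log(sigma2) + logdet(F))              # constants dropped
end

# Derivative-free hyperparameter search over a (theta, p) grid (hypothetical ranges).
function fit_hyperparameters(x, y;
        thetas = 10.0 .^ range(-2, 2, length = 40),
        ps     = range(1.0, 2.0, length = 20))
    best = (theta = first(thetas), p = first(ps), ll = -Inf)
    for theta in thetas, p in ps
        ll = log_likelihood(x, y, theta, p)
        if ll > best.ll
            best = (theta = theta, p = p, ll = ll)
        end
    end
    return best
end

# Toy usage: noisy samples of a smooth function.
x = collect(range(0.0, 5.0, length = 25))
y = sin.(x) .+ 0.05 .* randn(length(x))
best = fit_hyperparameters(x, y)
println("theta = ", best.theta, ", p = ", best.p)
```

The grid search stands in for any external derivative-free optimizer; once the library is fully differentiable, the same log-likelihood could instead be optimized with a gradient-based method.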