
Add support for sigma=0 in normally distributed hyperparameters #285

Closed
wants to merge 7 commits

Conversation

@nchristensen commented Jan 5, 2023

Moving this to a separate pull request per #280.

This pull request mainly does two things:

  1. It loosens the type restriction to allow floating-point values for mu. For instance, mu=4.5 might be chosen if one wants four and five to be sampled with equal probability. This required changing default_value to be mu rounded to the nearest integer when not otherwise specified.
  2. It allows (or rather handles more correctly) sigma=0 in NormalIntegerHyperparameter objects. In that case the distribution is a Dirac delta function and the only value ever returned from the distribution is the mean (mu). If mu is not an integer, an error is raised during instantiation.
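The two behaviors above can be sketched in plain Python. This is a minimal stdlib illustration of the described semantics, not the actual Cython implementation in ConfigSpace/hyperparameters.pyx; the function names here are invented for the example.

```python
import random


def sample_normal_int(mu: float, sigma: float) -> int:
    """Sample a normally distributed integer, rounding to the nearest int.

    With sigma == 0 the distribution degenerates to a Dirac delta, so the
    only legal value is mu itself, which must therefore be integral.
    """
    if sigma == 0:
        if mu != int(mu):
            raise ValueError("sigma=0 requires mu to be an integer")
        return int(mu)
    return round(random.gauss(mu, sigma))


def default_value(mu: float) -> int:
    # When not otherwise specified, the default becomes mu rounded
    # to the nearest integer (Python's round uses banker's rounding,
    # so e.g. round(4.5) == 4).
    return round(mu)
```

For example, `sample_normal_int(5.0, 0.0)` always returns 5, while `sample_normal_int(4.5, 0.0)` raises a ValueError at the point where ConfigSpace would raise during instantiation.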

@mfeurer (Contributor) left a comment

Thanks a lot for pulling out this change. Could you please add some comments in this PR on the changes (so it will be easy later to understand this PR) and add some comments (at least where I requested them)?

ConfigSpace/hyperparameters.pyx
codecov bot commented Jan 10, 2023

Codecov Report

Base: 67.97% // Head: 67.97% // No change to project coverage 👍

Coverage data is based on head (88051eb) compared to base (88051eb).
Patch has no changes to coverable lines.

❗ Current head 88051eb differs from pull request most recent head a8b1d6d. Consider uploading reports for the commit a8b1d6d to get more accurate results

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #285   +/-   ##
=======================================
  Coverage   67.97%   67.97%           
=======================================
  Files          25       25           
  Lines        1786     1786           
=======================================
  Hits         1214     1214           
  Misses        572      572           


@eddiebergman (Contributor)
Hi, this PR is out of date and will be entirely superseded once #346 is merged. As a result, I will close this, but thank you for the PR!

However, I was still wondering why there's a need for a Dirac delta function, given that this is equivalent to a Constant?

@nchristensen (Author) commented Apr 16, 2024

The idea was to autotune with ytopt using certain fixed values that might change later on. For example, autotuning the multiplication of a 32 x 32 matrix and a 32 x 1000 matrix at first and then later autotuning the multiplication of a 32 x 32 and 32 x 5000 matrix (but using the data from the first case to inform initial model).

I think I tried using Constant, but either ConfigSpace or ytopt didn't like it when I changed the value of the Constant between executions. I ended up taking a different approach (essentially, only tuning the largest matrix dimensions).

@eddiebergman (Contributor)
Thanks for the use case! Would it have helped if you could easily copy and modify a configuration space between executions? A lot of ConfigSpace relies on hyperparameters' attributes remaining fixed, but there's no real API for mutating spaces or hyperparameters, which I guess might have helped here.
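The copy-and-modify workflow suggested here could be approximated today by rebuilding the space from a mutable description. This is a hypothetical sketch using plain dicts as a stand-in for a configuration space; ConfigSpace itself has no such mutation API (that is the point of the comment above), and all names here are illustrative.

```python
import copy

# Toy stand-in for a configuration space: parameter name -> spec dict.
# In the matrix-multiplication use case, a fixed dimension is held as a
# constant whose value changes between tuning runs.
space = {
    "matrix_cols": {"type": "constant", "value": 1000},
    "unroll": {"type": "uniform_int", "lower": 1, "upper": 8},
}


def with_constant(space: dict, name: str, value) -> dict:
    """Return a deep copy of the space with one constant's value replaced,
    leaving the original space untouched."""
    new_space = copy.deepcopy(space)
    new_space[name]["value"] = value
    return new_space


# Retune for the larger problem without mutating the original space.
larger = with_constant(space, "matrix_cols", 5000)
```

Because the original space is never mutated, data gathered under the first space can still be associated with its exact definition, which is the invariant ConfigSpace relies on.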

@nchristensen (Author)
Yeah, that likely would have been helpful.
