
Bug suspicion #59

Open

edubois opened this issue Sep 24, 2018 · 3 comments

Comments

edubois commented Sep 24, 2018

Hi Patrik,

I see something that looks a bit strange to me:

```cpp
lambda = lambda * static_cast<float>(cv::norm(data)) / static_cast<float>(num_training_elements);
```

The lambda gets updated in place, but don't you want this method to be const and use a local copy of the lambda value? Otherwise we are changing the data member on each call to solve, which might mean it keeps getting smaller and smaller.

Just want to make sure.
What do you think?
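To make this concrete, here is a minimal sketch of the pattern I mean (the class and method names are assumed from the quoted line, not copied from the actual code):

```cpp
#include <iostream>
#include <opencv2/core.hpp>

// Hypothetical reduction of the pattern: get_matrix() is non-const and
// overwrites the member lambda on every call, like the line quoted above.
class Regulariser
{
public:
    explicit Regulariser(float lambda) : lambda(lambda) {}

    cv::Mat get_matrix(cv::Mat data, int num_training_elements)
    {
        // The suspicious update: the member, not a local copy, is assigned.
        lambda = lambda * static_cast<float>(cv::norm(data)) /
                 static_cast<float>(num_training_elements);
        return cv::Mat::eye(data.cols, data.cols, CV_32FC1) * lambda;
    }

    float lambda;
};

int main()
{
    Regulariser reg(1.5f);
    cv::Mat data = cv::Mat::ones(100, 10, CV_32FC1);
    // With this data, lambda shrinks on every call instead of staying at 1.5:
    for (int i = 0; i < 3; ++i)
    {
        reg.get_matrix(data, data.rows);
        std::cout << "lambda after call " << i + 1 << ": " << reg.lambda << "\n";
    }
    return 0;
}
```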

edubois changed the title from "Bug suspision" to "Bug suspicion" on Sep 24, 2018
patrikhuber (Owner) commented Sep 24, 2018

Hi Eloi,

Hmm! That's a good question. Just looking at the code/control flow, I think I would actually agree with you and say that you've indeed discovered a bug. Nice! Just to make sure, I would step through the code in the debugger with a trivial example, observe the value of lambda and the regularisation matrix, and see whether what we think is happening from just reading the code is really happening in practice too.

It has been quite a while since I've looked at the original paper, but if it does indeed decrease the lambda at each call to solve, I think that's not what we want to do.

It's an interesting one - it would basically mean we've unknowingly decreased the regularisation further with each successive regressor. But that makes me think: in the RCR training, we set the lambda per regressor:

```cpp
vector<LinearRegressor<VerbosePartialPivLUSolver>> regressors;
regressors.emplace_back(LinearRegressor<VerbosePartialPivLUSolver>(Regulariser(Regulariser::RegularisationType::MatrixNorm, 1.5f, false)));
regressors.emplace_back(LinearRegressor<VerbosePartialPivLUSolver>(Regulariser(Regulariser::RegularisationType::MatrixNorm, 1.5f, false)));
regressors.emplace_back(LinearRegressor<VerbosePartialPivLUSolver>(Regulariser(Regulariser::RegularisationType::MatrixNorm, 1.5f, false)));
regressors.emplace_back(LinearRegressor<VerbosePartialPivLUSolver>(Regulariser(Regulariser::RegularisationType::MatrixNorm, 1.5f, false)));
```

And I think solve() is only ever called once per regressor level, which would mean that this bug never actually occurs in practice, at least not until someone changes the code and uses it in a way that calls solve() more than once per regressor. It's hard to tell from reading alone, so as mentioned above, the best way would be to run the code and see whether what I think is correct :-)

edubois (Author) commented Sep 25, 2018

Hi Patrik,

Thanks for your comments,

Actually, I think your code is right for a bit of a tricky reason: the regulariser passed at this point:

```cpp
cv::Mat solve(cv::Mat data, cv::Mat labels, Regulariser regulariser)
```

is passed by copy, so whatever happens to this object has no effect on the next iteration. It's a bit dangerous, though.
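A tiny standalone example of what saves us here (hypothetical names, just to illustrate the value semantics):

```cpp
#include <iostream>

// Minimal stand-in for the Regulariser: only the lambda matters here.
struct Regulariser
{
    float lambda;
};

float solve(Regulariser regulariser) // by value: a copy is made at the call
{
    regulariser.lambda *= 0.5f; // stands in for the lambda-rescaling line
    return regulariser.lambda;
}

int main()
{
    Regulariser reg{1.5f};
    solve(reg);
    solve(reg);
    std::cout << reg.lambda << "\n"; // still 1.5: the caller's object is untouched
    return 0;
}
```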

patrikhuber (Owner) commented

Right! So the lambda = lambda * ... line is performed only on the copy of the Regulariser. That makes sense. I still think, though, that what I wrote is also likely correct: even if we didn't pass the regulariser by value there, the bug wouldn't happen, because solve() is only ever called once per regressor.

In any case, you're completely right: the best fix would be to change the Regulariser to work on a local copy of lambda and then pass the Regulariser by const-ref :-)
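Something along these lines (a rough sketch with assumed names, following the snippets quoted above):

```cpp
#include <opencv2/core.hpp>

class Regulariser
{
public:
    explicit Regulariser(float lambda) : lambda(lambda) {}

    // const: the data-dependent scaling happens on a local copy, so the
    // member lambda is never modified, no matter how often this is called.
    cv::Mat get_matrix(cv::Mat data, int num_training_elements) const
    {
        const float scaled_lambda = lambda * static_cast<float>(cv::norm(data)) /
                                    static_cast<float>(num_training_elements);
        return cv::Mat::eye(data.cols, data.cols, CV_32FC1) * scaled_lambda;
    }

private:
    float lambda;
};

// Pass by const-ref: no copy needed, and the compiler now guarantees that
// solve() cannot change the caller's regulariser.
cv::Mat solve(cv::Mat data, cv::Mat labels, const Regulariser& regulariser)
{
    cv::Mat reg_matrix = regulariser.get_matrix(data, data.rows);
    // ... the actual least-squares solve would follow here; omitted in this sketch.
    return reg_matrix;
}
```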

Thank you for reporting this!
