I noticed that, when weight-decay tuning is used, the linear probe is trained on both the training and the validation dataset: `train_loader = feature_train_val_loader` (CLIP_benchmark/clip_benchmark/metrics/linear_probe.py, Line 280 in a230282).
In the few-shot setting this can lead to the linear probe being trained with more samples per class than intended, because the code only filters the training data (Line 235) but not the validation dataset.
So, at least for the few-shot setting, only the train_loader should be used to train the classification head of the linear probe.
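For illustration, here is a minimal sketch of the suggested change, not the repository's actual code: the validation split is used only to select the weight decay, and the final classification head is then fit on the few-shot-filtered training features alone rather than on the combined train+val loader. The helper names (`train_linear_probe`, `evaluate`, `tune_and_train`), the hyperparameter grid, and the loader arguments are assumptions made for this sketch.

```python
# Sketch only: assumes pre-extracted feature loaders yielding (features, labels) batches.
import torch
import torch.nn as nn


def train_linear_probe(loader, feature_dim, num_classes, weight_decay, epochs=10, lr=0.1):
    """Fit a linear classification head on pre-extracted features."""
    head = nn.Linear(feature_dim, num_classes)
    optimizer = torch.optim.SGD(head.parameters(), lr=lr, weight_decay=weight_decay)
    criterion = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for features, labels in loader:
            optimizer.zero_grad()
            loss = criterion(head(features), labels)
            loss.backward()
            optimizer.step()
    return head


def evaluate(head, loader):
    """Top-1 accuracy of the head on a feature loader."""
    correct, total = 0, 0
    with torch.no_grad():
        for features, labels in loader:
            correct += (head(features).argmax(dim=1) == labels).sum().item()
            total += labels.numel()
    return correct / total


def tune_and_train(feature_train_loader, feature_val_loader, feature_dim, num_classes):
    # Use the validation split for model selection (weight-decay tuning) only.
    best_wd, best_acc = None, -1.0
    for wd in (1e-6, 1e-4, 1e-2):
        head = train_linear_probe(feature_train_loader, feature_dim, num_classes, weight_decay=wd)
        acc = evaluate(head, feature_val_loader)
        if acc > best_acc:
            best_wd, best_acc = wd, acc

    # Suggested fix: train the final head on the few-shot train loader only,
    # not on the combined train+val loader (feature_train_val_loader).
    return train_linear_probe(feature_train_loader, feature_dim, num_classes, weight_decay=best_wd)
```

This keeps the validation data out of the gradient updates, so the number of training samples per class stays at the intended few-shot budget.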
MarcoMorik changed the title from "[BUG] Leaking Validation Data in Few-Shot-Setting" to "Leaking Validation Data in Few-Shot-Setting" on May 3, 2024.