I've been trying to train multiple instances of a model with adapters, and I would like the adapters to be initialized with exactly the same weights whenever the same seed is used.
I tried to find out whether this is possible, but it doesn't seem to be. Is there any way to do this? A rough sketch of the setup I have in mind is below.
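For context, something like this is what I'm after (this assumes the `adapters` library API with `adapters.init()` and a default adapter config; the model and adapter names are just placeholders):

```python
import torch
from transformers import AutoModel
import adapters

def build_model_with_adapter(seed: int):
    # Reset the global PyTorch seed right before adding the adapter, in the hope
    # that the adapter weight initialization draws from this RNG state.
    torch.manual_seed(seed)
    model = AutoModel.from_pretrained("bert-base-uncased")
    adapters.init(model)             # enable adapter support on the base model
    model.add_adapter("my_adapter")  # adapter weights are randomly initialized here
    return model

model_a = build_model_with_adapter(seed=42)
model_b = build_model_with_adapter(seed=42)
# What I want: model_a and model_b start from identical adapter weights,
# but I couldn't find anything in the docs that guarantees this.
```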
Sorry for only getting back to you now; I know this issue has been open for a while. If I understand you correctly, you are interested in initializing adapters with the same weights, either within one model or across multiple models, correct?
I drafted PR #786, which introduces an additional parameter that resets the seed to a specified value before every adapter weight initialization. This should give you exactly the behavior you described.
I don't know whether this is still relevant to you, but I thought it would be a useful feature for research and experiments in general.
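To give you an idea of the intended usage, here is a rough sketch. It assumes the current `adapters` package naming (`SeqBnConfig`), and the `init_weights_seed` keyword is only a placeholder; where the parameter actually lives and what it is called is defined in the PR.

```python
from transformers import AutoModel
import adapters
from adapters import SeqBnConfig

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)

# Placeholder keyword: the PR adds an option to reset the RNG seed to a fixed
# value right before the adapter weights are initialized.
config = SeqBnConfig(init_weights_seed=42)
model.add_adapter("my_adapter", config=config)

# Every model set up with the same seed then starts from identical adapter weights.
```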
@TimoImhof thank you! I have finished working on the project that I needed this for, but I am planning on using AdapterHub in the near future, so this is still helpful :)