Can't load CLIP-ViT-H-14-frozen-xlm-roberta-large-laion5B-s13B-b90k #594
The same happens for laion/CLIP-ViT-B-32-xlm-roberta-base-laion5B-s13B-b90k.
If I try to load the model without HF, this error is encountered:

import open_clip
model, _, preprocess = open_clip.create_model_and_transforms('xlm-roberta-large-ViT-H-14', pretrained='frozen_laion5b_s13b_b90k')
Downloading (…)lve/main/config.json: 100% 616/616 [00:00<00:00, 136kB/s]
Downloading model.safetensors: 100% 2.24G/2.24G [01:05<00:00, 34.4MB/s]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.8/dist-packages/open_clip/factory.py", line 308, in create_model_and_transforms
model = create_model(
File "/usr/local/lib/python3.8/dist-packages/open_clip/factory.py", line 228, in create_model
load_checkpoint(model, checkpoint_path)
File "/usr/local/lib/python3.8/dist-packages/open_clip/factory.py", line 104, in load_checkpoint
incompatible_keys = model.load_state_dict(state_dict, strict=strict)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 2041, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for CustomTextCLIP:
Unexpected key(s) in state_dict: "text.transformer.embeddings.position_ids".
Downgrading my transformers version worked for me.
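The downgrade works because newer transformers releases reportedly dropped the `position_ids` buffer from the embedding modules, so a checkpoint saved against an older version carries a key the current model no longer expects. If downgrading is not an option, one workaround is to filter that key out of the checkpoint before calling `load_state_dict`. A minimal sketch; the helper name is ours, not part of the open_clip API:

```python
def strip_position_ids(state_dict):
    """Return a copy of state_dict without stale *.position_ids buffer keys."""
    return {k: v for k, v in state_dict.items() if not k.endswith("position_ids")}

# Hypothetical usage (assumes you have already downloaded the checkpoint
# and built the model, e.g. via torch.load and open_clip):
#   state_dict = torch.load(checkpoint_path, map_location="cpu")
#   model.load_state_dict(strip_position_ids(state_dict))
```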
I also just put out a PR yesterday to fix this: #595
What about the first error, when the config can't be found?
No open_clip_config.json was pushed by whoever uploaded this model, so the hf-hub method won't work, as it sources the model config from the hub instead of from open_clip's bundled configs...
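In other words, the routing roughly splits on the model name: names beginning with `hf-hub:` are resolved against the Hub (and need an `open_clip_config.json` there), while everything else resolves against open_clip's bundled configs. A simplified sketch of that split; the helper is illustrative, not open_clip's actual internals:

```python
HF_HUB_PREFIX = "hf-hub:"

def resolve_source(model_name: str) -> str:
    """Return where a model config would be looked up (illustrative only)."""
    if model_name.startswith(HF_HUB_PREFIX):
        return "hub"      # expects open_clip_config.json on the Hugging Face Hub
    return "builtin"      # uses open_clip's bundled model config JSON files
```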
use this instead:
When I try to load this model, the following error occurs: