System Info
Tested with TEI 1.2, 1.4, and latest (ghcr.io/huggingface/text-embeddings-inference:cuda-latest)
OS: Docker on Debian 12
Model: dophys/bge-m3_finetuned
Hardware: 1x NVIDIA L4
Information
Tasks
Reproduction
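The exact command was not captured in this report; a minimal sketch of the invocation that hits the error, assuming default port mapping and GPU flags (the image tag and model id come from System Info above):

```shell
# Sketch of the failing invocation. Port mapping and --gpus are assumptions;
# the image tag and --model-id match the model named in System Info.
docker run --gpus all -p 8080:80 \
  ghcr.io/huggingface/text-embeddings-inference:cuda-latest \
  --model-id dophys/bge-m3_finetuned
```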
Got error:

Error: Could not create backend

Caused by:
    Could not start backend: cannot find tensor embeddings.word_embeddings.weight

(For TEI 1.2 / 1.4, it throws a different tokenizer.json error.)
Expected behavior

Expected the model to be served successfully, since its base model BAAI/bge-m3 can be served with TEI and the model has the text-embeddings-inference tag on its model card page.
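For comparison, the same invocation pointed at the base model (same assumed flags as the reproduction sketch above) is the case that works:

```shell
# Same assumed flags as the reproduction above, but with the base model,
# which TEI can serve according to the note on the expected behavior.
docker run --gpus all -p 8080:80 \
  ghcr.io/huggingface/text-embeddings-inference:cuda-latest \
  --model-id BAAI/bge-m3
```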