Issues: huggingface/text-embeddings-inference
#433: Sagemaker Asynchronous TEI Endpoints Fail On Requests Greater Than 2mb (opened Nov 7, 2024 by ma3842)
#432: curl: (56) Recv failure: Connection reset by peer (opened Oct 26, 2024 by Tejaswgupta)
#431: Run TEI model on CPU fails (says Cuda f16 and flash attention is required) (opened Oct 25, 2024 by Astlaan)
#429: TEI Process dying on Sagemaker Endpoint with g4dn.xlarge (opened Oct 24, 2024 by BebehCodes)
#424: thread 'tokio-runtime-worker' panicked at /usr/src/backends/src/lib.rs:176:14 (opened Oct 14, 2024 by jackli0127)
#398: Inconsistency in how different URL paths are handled (in inference endpoints) (opened Sep 4, 2024 by MoritzLaurer)
#396: dunzhang/stella_en_1.5B_v5 Maximum Token Limit Set to 512 Despite Model Capabilities (opened Sep 4, 2024 by taoari)
#394: Input validation error: inputs must have less than 32000 characters. Given: 67337 (opened Sep 3, 2024 by ffalkenberg)
#374: Get opentelemetry trace id from request headers instead of creating a new trace (opened Aug 8, 2024 by ptanov)