
Error not reported correctly by hf_raise_for_status in create_inference_endpoint #2510

Closed
aymeric-roucher opened this issue Sep 4, 2024 · 1 comment
Labels
bug Something isn't working

aymeric-roucher commented Sep 4, 2024

Describe the bug

When running this snippet:

from huggingface_hub.hf_api import HfApi
from huggingface_hub import login

login("HF_TOKEN")  # placeholder for a real user access token

hf_api = HfApi()

endpoint = hf_api.create_inference_endpoint(
    "test",
    repository="meta-llama/Meta-Llama-3.1-8B",
    framework="pytorch",
    accelerator="gpu",
    instance_size="xlarge",
    instance_type="g6",
    region="us-east-1",
    vendor="aws",
)

with the `task` parameter left undefined, the error message I get is empty after "Bad request:":

File "/Users/aymeric/Documents/Code/endpoint-snooper/.venv/lib/python3.12/site-packages/huggingface_hub/hf_api.py", line 7484, in create_inference_endpoint
    hf_raise_for_status(response)
  File "/Users/aymeric/Documents/Code/endpoint-snooper/.venv/lib/python3.12/site-packages/huggingface_hub/utils/_errors.py", line 358, in hf_raise_for_status
    raise BadRequestError(message, response=response) from e
huggingface_hub.utils._errors.BadRequestError:  (Request ID: WWttaQ)

Bad request:

But it should contain some useful information: if we add a `print(response.text)` just before the `hf_raise_for_status(response)` call in `create_inference_endpoint`, we get the precious detail `Failed to parse the request body as JSON: model.task: expected value at line 1 column 257`, which makes it possible to correct the issue.

So the 400 error seems to be improperly handled in `hf_raise_for_status`, causing the content of the error to be lost.
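The actual fix may differ, but the general pattern is to fall back to the raw response body when the error payload cannot be parsed as structured JSON, instead of raising with an empty message. A minimal stdlib-only sketch of that pattern (the `FakeResponse` class and `hf_raise_for_status_sketch` helper are hypothetical stand-ins, not the library's real implementation):

```python
import json


class BadRequestError(ValueError):
    """Raised for HTTP 400 responses; carries the server's message."""


class FakeResponse:
    # Stand-in for a requests.Response whose error body is plain text, not JSON.
    status_code = 400
    text = "Failed to parse the request body as JSON: model.task: expected value"
    headers = {"X-Request-Id": "WWttaQ"}


def hf_raise_for_status_sketch(response):
    if response.status_code != 400:
        return
    # Try to extract a structured error message from a JSON body...
    try:
        message = json.loads(response.text)["error"]
    except (json.JSONDecodeError, KeyError, TypeError):
        # ...but fall back to the raw body instead of dropping it.
        message = response.text
    request_id = response.headers.get("X-Request-Id", "")
    raise BadRequestError(f"Bad request (Request ID: {request_id}): {message}")


try:
    hf_raise_for_status_sketch(FakeResponse())
except BadRequestError as e:
    print(e)
```

With the fallback in place, the "Failed to parse the request body as JSON" detail survives into the raised exception rather than being swallowed.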

Reproduction

No response

Logs

No response

System info

- huggingface_hub version: 0.24.6
- Platform: macOS-14.1-arm64-arm-64bit
- Python version: 3.12.0
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: /Users/aymeric/.cache/huggingface/token
- Has saved token ?: True
- Who am I ?: m-ric
- Configured git credential helpers: osxkeychain, store
- FastAI: N/A
- Tensorflow: N/A
- Torch: N/A
- Jinja2: N/A
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: N/A
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: N/A
- pydantic: N/A
- aiohttp: N/A
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /Users/aymeric/.cache/huggingface/hub
- HF_ASSETS_CACHE: /Users/aymeric/.cache/huggingface/assets
- HF_TOKEN_PATH: /Users/aymeric/.cache/huggingface/token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
aymeric-roucher added the bug label Sep 4, 2024
Wauplin (Contributor) commented Sep 4, 2024

Hi @aymeric-roucher, thanks for reporting this! As a matter of fact, this has been an annoying issue for quite some time, but we fixed it two weeks ago in #2474. It will be shipped in the next release.

Wauplin closed this as completed Sep 4, 2024