What happened?
While using Bedrock with Claude Sonnet 3.5 v2, I see a lot of the following in my application:
openai.APIError: litellm.APIConnectionError: Bad response code, expected 200: {'status_code': 400, 'headers': {':exception-type': 'serviceUnavailableException', ':content-type': 'application/json', ':message-type': 'exception'}, 'body': b'{"message":"Bedrock is unable to process your request."}'}
The combination of a 400 status code and a serviceUnavailableException seems like it should be impossible (service-unavailable errors are 5xx), and the error message itself is quite useless. See https://docs.aws.amazon.com/bedrock/latest/userguide/troubleshooting-api-error-codes.html
These errors occur seemingly at random, and moments later the same request works fine. I have configured fallbacks in my configuration:
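For context, my fallback setup looks roughly like this; the model_name values, model IDs, and regions below are illustrative placeholders, not my exact config:

```yaml
model_list:
  - model_name: claude-sonnet        # primary deployment (placeholder name)
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
      aws_region_name: us-east-1     # illustrative region
  - model_name: claude-fallback      # fallback deployment (placeholder name)
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
      aws_region_name: us-west-2     # illustrative region

router_settings:
  num_retries: 3
  # each entry maps a primary model_name to the deployments
  # that should be tried when it fails
  fallbacks:
    - claude-sonnet: ["claude-fallback"]
```

My expectation is that a failed request against the primary deployment is retried and then routed to the fallback deployment, but that is not what I observe.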
I also tested
The documentation on configuring fallbacks this way is quite sparse.
I ran LiteLLM in debug mode, which produced the most verbose logging I have seen in a while, but I see no mention of retries being attempted.
Having debugged this for days now, I am not sure what is left to try.
Relevant log output
No response
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.57.3
Twitter / LinkedIn details
No response