Bug Description
Anthropic updated their package. The prompt caching types are now outdated and throw import errors:
from anthropic.types.beta.prompt_caching import (
PromptCachingBetaTextBlockParam,
PromptCachingBetaCacheControlEphemeralParam,
)
Based on the changelog, these can be updated to:
from anthropic.types import (
TextBlockParam,
CacheControlEphemeralParam,
)
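A version-tolerant workaround is also possible. The following is a minimal sketch of a guarded import, assuming only the module path and type names changed between anthropic releases; it is illustrative, not the fix shipped in llama-index-llms-anthropic:
try:
    # anthropic >= 0.41.0 exposes the types at the top level
    from anthropic.types import (
        TextBlockParam,
        CacheControlEphemeralParam,
    )
except ImportError:
    # older anthropic releases still expose the beta prompt caching types
    from anthropic.types.beta.prompt_caching import (
        PromptCachingBetaTextBlockParam as TextBlockParam,
        PromptCachingBetaCacheControlEphemeralParam as CacheControlEphemeralParam,
    )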
Version
0.12.5
Steps to Reproduce
1. Install llama-index (0.12.5)
2. Install llama-index-llms-anthropic (0.6.1)
3. Use the 0.41.0 release of the anthropic package (released December 17, 2024)
4. Import the llama-index-llms-anthropic package in Python (a minimal reproduction sketch follows below)
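A minimal reproduction sketch, assuming the three packages above are installed at the listed versions (e.g. pip install llama-index==0.12.5 llama-index-llms-anthropic==0.6.1 anthropic==0.41.0); importing the integration is enough to trigger the failure:
# Loading the Anthropic LLM wrapper imports its utils module, which still
# references anthropic.types.beta.prompt_caching and therefore fails.
from llama_index.llms.anthropic import Anthropic  # raises ModuleNotFoundError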
Relevant Logs/Tracebacks
ropic/utils.py", line 18, in <module> from anthropic.types.beta.prompt_caching import (ModuleNotFoundError: No module named 'anthropic.types.beta.prompt_caching'
Hey @mclean-connor! I'm here to help you with the issue you've encountered. I can assist with bug fixes, answer questions, and guide you on becoming a contributor. Let's work on resolving this together!
To resolve the import errors caused by the update in the anthropic package version 0.41.0, you need to update the import statements in your code. The outdated imports from anthropic.types.beta.prompt_caching should be replaced with the new imports from anthropic.types. Specifically, update your code as follows:
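# Replace the removed anthropic.types.beta.prompt_caching imports
# with the top-level types:
from anthropic.types import (
    TextBlockParam,
    CacheControlEphemeralParam,
)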