Feature Description
Request to integrate LangSmith's OpenTelemetry tracing capabilities with our LlamaIndex implementation, enabling comprehensive monitoring and debugging of LLM operations. We are currently hitting authentication failures (401 Unauthorized) when exporting traces to LangSmith's OpenTelemetry endpoint.
Current implementation attempts to:
Initialize Traceloop SDK with LangSmith's OpenTelemetry endpoint
Configure custom headers for authentication
Track LlamaIndex operations with Azure OpenAI
Export traces to LangSmith for monitoring and analysis
Reason
The current implementation fails with an authentication error, preventing trace export to LangSmith. This limits our ability to:
Monitor LLM operations in real-time
Debug performance issues
Track token usage and costs
Analyze query patterns and responses
ERROR:opentelemetry.exporter.otlp.proto.http.trace_exporter:Failed to export batch code: 401, reason: {"error":"Unauthorized"}
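A 401 from the OTLP endpoint usually means the request reached LangSmith without a valid x-api-key. One way to rule out a missing key before Traceloop.init is ever called is a fail-fast helper along these lines (hypothetical helper name; assumes the key lives in LANGCHAIN_API_KEY as in the snippet below):

```python
import os

def build_otel_headers(env_var: str = "LANGCHAIN_API_KEY") -> dict:
    """Build the exporter headers, failing fast when the key is absent.

    An empty or missing x-api-key is the most common cause of the 401 above.
    """
    api_key = os.getenv(env_var)
    if not api_key:
        raise RuntimeError(
            f"{env_var} is not set; the OTLP exporter would send no credentials"
        )
    return {"x-api-key": api_key, "content-type": "application/protobuf"}
```

Calling this before Traceloop.init(...) turns a silent 401 at export time into an explicit error at startup.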
Value of Feature
Integration with LangSmith's OpenTelemetry endpoint would provide:
Enhanced Observability:
Real-time monitoring of LLM operations
Detailed trace information for debugging
Performance metrics and bottleneck identification
Operational Benefits:
Improved debugging capabilities
Better understanding of token usage and costs
Ability to optimize query patterns
Enhanced monitoring of production deployments
Development Efficiency:
Faster issue resolution
Better understanding of system behavior
Improved development workflow with trace visualization
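As one concrete example of the token-usage point above: once traces export correctly, per-operation token counts could be aggregated from span attributes. The attribute names below follow the OpenTelemetry GenAI semantic conventions and are an assumption; the names Traceloop actually emits may differ.

```python
# Sketch: aggregate token usage from a list of exported spans,
# represented here as plain dicts of span attributes.
spans = [
    {"name": "llm.chat", "gen_ai.usage.input_tokens": 120, "gen_ai.usage.output_tokens": 45},
    {"name": "llm.chat", "gen_ai.usage.input_tokens": 80, "gen_ai.usage.output_tokens": 30},
]

totals = {
    "input": sum(s.get("gen_ai.usage.input_tokens", 0) for s in spans),
    "output": sum(s.get("gen_ai.usage.output_tokens", 0) for s in spans),
}
print(totals)  # {'input': 200, 'output': 75}
```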
Code integration
```python
# Initialize Traceloop SDK and LlamaIndex with Azure OpenAI.
# This script sets up tracing and tests the LlamaIndex integration with Azure OpenAI.
import os

from traceloop.sdk import Traceloop

# Get the LangSmith API key from environment variables for tracing
LANGSMITH_API_KEY = os.getenv("LANGCHAIN_API_KEY")

# Initialize Traceloop with LangSmith endpoint and authentication:
# - api_endpoint: LangSmith OTEL endpoint for trace collection
# - headers: authentication and content-type headers
# - disable_batch: send traces immediately without batching
# - app_name: name of the application for trace identification
Traceloop.init(
    api_endpoint="https://api.smith.langchain.com/otel",
    headers={
        "x-api-key": LANGSMITH_API_KEY,
        "content-type": "application/protobuf",
    },
    disable_batch=True,
    app_name="testt",
)

# Import required LlamaIndex components
from llama_index.core import VectorStoreIndex, Document, Settings
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding

from app.core.config import get_settings

# Load application settings
settings = get_settings()

# Validate that all required Azure OpenAI settings are present
required_settings = [
    ("Azure OpenAI API Key", settings.azure_openai_api_key),
    ("Azure OpenAI Endpoint", settings.azure_openai_endpoint),
    ("Azure OpenAI Deployment Name", settings.azure_openai_deployment_name),
    ("Azure OpenAI API Version", settings.azure_openai_api_version),
    ("Azure OpenAI Embeddings Name", settings.azure_openai_embeddings_name),
    ("Azure OpenAI Embeddings Endpoint", settings.azure_openai_embeddings_endpoint),
]
missing = [name for name, value in required_settings if not value]
if missing:
    raise ValueError(f"Missing required settings: {', '.join(missing)}")

# Configure LlamaIndex to use Azure OpenAI for text generation
Settings.llm = AzureOpenAI(
    model=settings.text_model,
    engine=settings.azure_openai_deployment_name,
    deployment_name=settings.azure_openai_deployment_name,
    api_key=settings.azure_openai_api_key,
    azure_endpoint=settings.azure_openai_endpoint,
    api_version=settings.azure_openai_api_version,
)

# Configure LlamaIndex to use Azure OpenAI for embeddings
Settings.embed_model = AzureOpenAIEmbedding(
    model=settings.azure_openai_embeddings_model,
    deployment_name=settings.azure_openai_embeddings_name,
    api_key=settings.azure_openai_api_key,
    azure_endpoint=settings.azure_openai_embeddings_endpoint,
    api_version=settings.azure_openai_embeddings_api_version,
)

# Test the setup with a sample document and query
try:
    # Create a test index from an example document
    documents = [Document.example()]
    index = VectorStoreIndex.from_documents(documents)
    query_engine = index.as_query_engine()
    # Run a test query
    response = query_engine.query("What is this document about?")
    print(f"Query Response: {response}")
except Exception as e:
    print(f"Error occurred: {e}")
```
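As an alternative to passing headers through Traceloop.init, the exporter could be configured through the standard OpenTelemetry environment variables, since traceloop-sdk is built on opentelemetry-sdk. This is a sketch under the assumption that Traceloop honors these variables rather than overriding them; the placeholder key is illustrative only.

```python
import os

# Point the underlying OTLP exporter at LangSmith via standard OTEL env vars.
api_key = os.getenv("LANGCHAIN_API_KEY", "lsv2_pt_placeholder")
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://api.smith.langchain.com/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"x-api-key={api_key}"
```

Setting these before any SDK initialization keeps the credentials out of application code entirely.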
Additional Context
Currently using:
LlamaIndex with Azure OpenAI integration
Traceloop SDK for OpenTelemetry implementation
LangSmith as the target platform for trace collection
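For completeness, the stack above roughly maps to these PyPI packages (a sketch; the exact LlamaIndex subpackages needed depend on the version in use):

```shell
pip install traceloop-sdk \
    llama-index \
    llama-index-llms-azure-openai \
    llama-index-embeddings-azure-openai
```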
Related Documentation
OpenTelemetry Tracing Guide for LangSmith