I'm encountering an issue after updating the version of LlamaIndex in our project (SEC Insights). The problem specifically relates to sub-questions not being displayed on the frontend.
Steps to Reproduce:
Update LlamaIndex to the latest version.
Use OpenAIAgent with SubQuestionQueryEngine.
Trigger sub-questions during a query.
Expected Behavior: The CBEventType.SUB_QUESTION should be triggered inside ChatCallbackHandler, allowing sub-questions and their responses to be streamed and displayed on the frontend.
Observed Behavior: After updating, CBEventType.SUB_QUESTION is no longer triggered, so sub-question answers are not streamed to the frontend.
Additional Context: This issue is impacting the ability to handle sub-questions in a live environment. Has anyone else encountered this problem, or are there any workarounds?
Environment:
llama-index: "^0.11.13"
llama-index-agent-openai: "^0.3.4"
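For anyone debugging this, here is a minimal, self-contained sketch of the callback flow the frontend depends on. The class and payload names below are hypothetical stand-ins that only mirror the shape of LlamaIndex's legacy callback API (CBEventType, a handler with on_event_start), so it runs without the library installed; it is not the real llama_index implementation. Note that recent LlamaIndex releases have been migrating event handling toward the llama_index.core.instrumentation module, which may explain why legacy callback events stop firing after an upgrade.

```python
# Stand-in sketch of the SUB_QUESTION callback flow. All names here are
# illustrative stand-ins mirroring llama_index's legacy callback API,
# NOT the real library classes.
from enum import Enum
from typing import Any, Dict, List, Optional, Tuple


class CBEventType(str, Enum):
    """Subset of event types relevant to this issue."""
    SUB_QUESTION = "sub_question"
    LLM = "llm"


class SubQuestionCapturingHandler:
    """Minimal handler that records SUB_QUESTION payloads, analogous to
    how a ChatCallbackHandler would stream them to the frontend."""

    def __init__(self) -> None:
        self.captured: List[Dict[str, Any]] = []

    def on_event_start(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
    ) -> None:
        # Only sub-question events are forwarded; other events are ignored.
        if event_type == CBEventType.SUB_QUESTION and payload is not None:
            self.captured.append(payload)


def emit_events(
    handler: SubQuestionCapturingHandler,
    events: List[Tuple[CBEventType, Dict[str, Any]]],
) -> None:
    """Simulate a query engine firing callback events during a query."""
    for event_type, payload in events:
        handler.on_event_start(event_type, payload)


handler = SubQuestionCapturingHandler()
emit_events(handler, [
    (CBEventType.LLM, {"prompt": "decompose the user query"}),
    (CBEventType.SUB_QUESTION, {"sub_question": "What was 2022 revenue?"}),
])
# handler.captured now holds the sub-question payload the frontend expects.
print(handler.captured)
```

If the upgraded engine no longer dispatches SUB_QUESTION events to the registered handler, `captured` stays empty, which matches the observed behavior of nothing streaming to the frontend.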