
Sub-questions not triggered in ChatCallbackHandler on the latest LlamaIndex version when using OpenAIAgent with SubQuestionQueryEngine #119

Open
byamasu-patrick opened this issue Oct 1, 2024 · 0 comments

Description:

I'm encountering an issue after updating the version of LlamaIndex in our project (SEC Insights). The problem specifically relates to sub-questions not being displayed on the frontend.

Steps to Reproduce:

1. Update LlamaIndex to the latest version.
2. Use OpenAIAgent with a SubQuestionQueryEngine (a minimal setup sketch follows this list).
3. Trigger sub-questions during a query.
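A minimal sketch of the setup described above, assuming a pre-built `index`, an OpenAI API key in the environment, and an existing instance of the project's `ChatCallbackHandler` (constructed elsewhere); tool names and descriptions are illustrative, not the actual SEC Insights configuration:

```python
# Minimal reproduction sketch. `index` is a pre-built index and
# `chat_callback_handler` is the project's ChatCallbackHandler instance.
from llama_index.agent.openai import OpenAIAgent
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.tools import QueryEngineTool, ToolMetadata

# Register the handler globally so nested engines (including the
# SubQuestionQueryEngine) inherit the same callback manager.
Settings.callback_manager = CallbackManager([chat_callback_handler])

# Wrap a per-document query engine as a tool for the sub-question engine.
doc_tool = QueryEngineTool(
    query_engine=index.as_query_engine(),
    metadata=ToolMetadata(
        name="sec_filing",
        description="Answers questions about a single SEC filing.",
    ),
)

sub_question_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[doc_tool],
    use_async=True,
)

# Expose the sub-question engine to the agent as another tool.
agent = OpenAIAgent.from_tools(
    tools=[
        QueryEngineTool(
            query_engine=sub_question_engine,
            metadata=ToolMetadata(
                name="compare_filings",
                description="Decomposes comparison questions into sub-questions.",
            ),
        )
    ],
    verbose=True,
)

# A comparison question should cause SUB_QUESTION callback events to fire.
response = agent.stream_chat("Compare revenue growth across the two filings.")
```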
Expected Behavior: The CBEventType.SUB_QUESTION event should fire inside ChatCallbackHandler, allowing sub-questions and their responses to be streamed and displayed on the frontend (see the handler sketch below).
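For reference, a stripped-down handler along the lines of what ChatCallbackHandler does, assuming only the standard `BaseCallbackHandler` interface; the class name and the print statements are placeholders for the real streaming logic:

```python
from typing import Any, Dict, List, Optional

from llama_index.core.callbacks.base_handler import BaseCallbackHandler
from llama_index.core.callbacks.schema import CBEventType, EventPayload


class SubQuestionLoggingHandler(BaseCallbackHandler):
    """Logs sub-question events so their absence is easy to confirm."""

    def __init__(self) -> None:
        # Don't ignore any event types; we want to see SUB_QUESTION events.
        super().__init__(event_starts_to_ignore=[], event_ends_to_ignore=[])

    def on_event_start(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        parent_id: str = "",
        **kwargs: Any,
    ) -> str:
        if event_type == CBEventType.SUB_QUESTION:
            print("SUB_QUESTION started:", payload)
        return event_id

    def on_event_end(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        **kwargs: Any,
    ) -> None:
        # This is the branch that never fires after the upgrade.
        if event_type == CBEventType.SUB_QUESTION and payload is not None:
            qa_pair = payload.get(EventPayload.SUB_QUESTION)
            print("SUB_QUESTION completed:", qa_pair)

    def start_trace(self, trace_id: Optional[str] = None) -> None:
        pass

    def end_trace(
        self,
        trace_id: Optional[str] = None,
        trace_map: Optional[Dict[str, List[str]]] = None,
    ) -> None:
        pass
```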

Observed Behavior: After updating, CBEventType.SUB_QUESTION is never triggered, so sub-question answers are not streamed to the frontend.

Additional Context: This issue is impacting the ability to handle sub-questions in a live environment. Has anyone else encountered this problem, or are there any workarounds?

Environment:

LlamaIndex (`llama-index`): "^0.11.13"
LlamaIndex OpenAI agent (`llama-index-agent-openai`): "^0.3.4"
