
Example on using LlamaIndex stream_chat() #885

Open
nick-youngblut opened this issue Nov 21, 2023 · 1 comment

@nick-youngblut

The Streamlit docs on creating a streaming chatbot show the following example:

for response in client.chat.completions.create(
    model=st.session_state["openai_model"],
    messages=[
        {"role": m["role"], "content": m["content"]}
        for m in st.session_state.messages
    ],
    stream=True,
):
    ...  # loop body omitted in the excerpt

...but there is no example (that I can find) of streaming from a chat engine created from an index object, as shown in the LlamaIndex examples:

chat_engine = index.as_chat_engine()
streaming_response = chat_engine.stream_chat("Tell me a joke.")
for token in streaming_response.response_gen:
    print(token, end="")

If I try to use `chat_engine.stream_chat` with the `for response in client.chat.completions.create()` pattern shown in the Streamlit docs, I get the following error: `RuntimeError: There is no current event loop in thread 'ScriptRunner.scriptThread'.`
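A minimal sketch of one possible workaround, assuming the error comes from LlamaIndex constructing an `asyncio.Queue` in Streamlit's script thread, which has no event loop on Python 3.9. `ensure_event_loop` is a hypothetical helper, not part of either library; it would be called once near the top of the script, before `stream_chat`:

```python
import asyncio


def ensure_event_loop() -> asyncio.AbstractEventLoop:
    """Attach a new asyncio event loop to the current thread if none exists.

    Streamlit runs scripts in a worker thread ('ScriptRunner.scriptThread'),
    which has no event loop by default; on Python 3.9, asyncio.Queue.__init__
    calls events.get_event_loop(), which raises RuntimeError in such a thread.
    """
    try:
        return asyncio.get_event_loop()
    except RuntimeError:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        return loop
```

This only addresses the missing-loop symptom seen in the traceback below; whether it is the right fix for `stream_chat` specifically would need confirming against the LlamaIndex source.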

@nick-youngblut
Author

An example traceback when simply using:

if st.session_state.messages[-1]["role"] != "assistant":
    with st.chat_message("assistant"):
        message_placeholder = st.empty()
        full_response = ""
        for token in st.session_state.chat_engine.stream_chat(prompt):
            print(token)
        message_placeholder.markdown(full_response)
        st.session_state.messages.append({"role": "assistant", "content": full_response})

As in the Streamlit docs on creating a streaming chatbot.

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 534, in _run_script
    exec(code, module.__dict__)
  File "/workspaces/gcp_llm/app.py", line 67, in <module>
    for token in st.session_state.chat_engine.stream_chat(prompt):
  File "/usr/local/lib/python3.9/site-packages/llama_index/callbacks/utils.py", line 39, in wrapper
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 444, in stream_chat
    chat_response = self._chat(
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 330, in _chat
    agent_chat_response = self._get_agent_response(mode=mode, **llm_chat_kwargs)
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 295, in _get_agent_response
    return self._get_stream_ai_response(**llm_chat_kwargs)
  File "/usr/local/lib/python3.9/site-packages/llama_index/agent/openai_agent.py", line 196, in _get_stream_ai_response
    chat_stream_response = StreamingAgentChatResponse(
  File "<string>", line 10, in __init__
  File "/usr/local/lib/python3.9/asyncio/queues.py", line 36, in __init__
    self._loop = events.get_event_loop()
  File "/usr/local/lib/python3.9/asyncio/events.py", line 642, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
RuntimeError: There is no current event loop in thread 'ScriptRunner.scriptThread'.
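For reference, the token-accumulation pattern the two snippets above are trying to combine can be sketched in plain Python, independent of the event-loop error. `accumulate_stream` is a hypothetical stand-in name, and the fake generator below stands in for `chat_engine.stream_chat(prompt).response_gen` (note the iteration is over `.response_gen`, not the response object itself):

```python
from typing import Iterable, Iterator


def accumulate_stream(tokens: Iterable[str]) -> Iterator[str]:
    """Yield the growing response text after each streamed token,
    mirroring the message_placeholder.markdown(full_response) pattern."""
    full_response = ""
    for token in tokens:
        full_response += token
        yield full_response


# Stand-in for chat_engine.stream_chat("Tell me a joke.").response_gen
fake_response_gen = iter(["Why ", "did ", "the ", "chicken..."])
for partial in accumulate_stream(fake_response_gen):
    pass  # in Streamlit: message_placeholder.markdown(partial)
print(partial)  # → Why did the chicken...
```

In the Streamlit app, the final accumulated string would then be appended to `st.session_state.messages` as the assistant message.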
