Function tool callback #16637
base: main
Conversation
"""Async Call.""" | ||
tool_output = self._fn(*args, **kwargs) | ||
final_output_content = str(tool_output) | ||
callback_output = self._run_callback(tool_output) |
Probably the callback should be async? Otherwise this will block the event loop (probably not ideal)
In my view, yes. We developed this feature so that when a FunctionTool is called, it can request user input that influences the result or execution of that function.
That said, it could make sense for it to be asynchronous; in our case, however, we only use synchronous calls. If you believe it should be handled asynchronously, I can change it without any problem.
I think in a lot of use cases, people are using something like FastAPI to serve APIs, and you wouldn't want this callback to halt the entire server.
It probably makes sense to let the user provide either a sync or an async callback, and llama-index handles converting it either way: if a sync function is provided, we can make it a "fake" async function with a wrapper; if an async function is provided, we can make it sync using asyncio_run from llama_index.core.utils, i.e. callback_output = asyncio_run(async_fn(tool_output)).
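The sync/async normalization suggested above can be sketched roughly as follows. This is an illustrative sketch, not llama-index code: the helper names `to_async` and `to_sync` are hypothetical, and `asyncio.run` stands in for the `asyncio_run` utility mentioned in the comment.

```python
import asyncio
import inspect
from typing import Any, Callable


def to_async(fn: Callable[..., Any]) -> Callable[..., Any]:
    """Return a callback that can always be awaited.

    Coroutine functions pass through unchanged; plain functions are
    wrapped in a thin async shim (the "fake" async wrapper above).
    """
    if inspect.iscoroutinefunction(fn):
        return fn

    async def wrapper(*args: Any, **kwargs: Any) -> Any:
        return fn(*args, **kwargs)

    return wrapper


def to_sync(fn: Callable[..., Any]) -> Callable[..., Any]:
    """Return a callback that can be called from synchronous code.

    Async callbacks are driven to completion with asyncio.run; plain
    functions pass through unchanged.
    """
    if not inspect.iscoroutinefunction(fn):
        return fn

    def wrapper(*args: Any, **kwargs: Any) -> Any:
        return asyncio.run(fn(*args, **kwargs))

    return wrapper
```

Note that the "fake" async wrapper still executes the sync callback on the event loop thread; a long-blocking callback would need `loop.run_in_executor` to truly avoid stalling the server.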
Thinking about this more: for someone making an asynchronous call, it really doesn't make sense conceptually to block the application.
So whoever uses an asynchronous callback in the asynchronous call path is in a different situation.
I will make the adjustment.
Description
This is a feature that allows applying some human-in-the-loop concepts in FunctionTool.
Basically, a callback function is added that enables the developer to request user input in the middle of an agent interaction, as well as allowing any programmatic action.
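The idea described above can be sketched as a minimal stand-in. Note this is not the actual llama-index API: `CallbackTool` and its parameters are hypothetical names used only to illustrate how a post-execution callback could feed human input back into a tool's final output.

```python
from typing import Any, Callable, Optional


class CallbackTool:
    """Illustrative stand-in for a FunctionTool with a callback hook."""

    def __init__(
        self,
        fn: Callable[..., Any],
        callback: Optional[Callable[[Any], Any]] = None,
    ) -> None:
        self._fn = fn
        self._callback = callback

    def __call__(self, *args: Any, **kwargs: Any) -> str:
        # Run the wrapped function as usual.
        tool_output = self._fn(*args, **kwargs)
        final_output_content = str(tool_output)
        if self._callback is not None:
            # The callback sees the raw output and can, e.g., prompt a
            # human for approval; any returned text is appended to the
            # tool's final content.
            callback_output = self._callback(tool_output)
            if callback_output:
                final_output_content += f" Callback: {callback_output}"
        return final_output_content


# Example: a callback that stands in for asking the user to confirm.
tool = CallbackTool(lambda a, b: a + b, callback=lambda out: "approved by user")
print(tool(2, 3))  # -> "5 Callback: approved by user"
```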
New Package?
Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Your pull-request will likely not be merged unless it is covered by some form of impactful unit testing.
Suggested Checklist:
Ran make format; make lint to appease the lint gods