feat: Add Tool.from_component #159
Conversation
Pull Request Test Coverage Report for Build 12653556844 — Details
💛 - Coveralls
Thanks for the effort you are putting into this PR.
I do have some questions and suggestions.
The requirement was to support components with basic str or native Python types as input. This implementation appears to go beyond that, which is great for flexibility but might become hard to maintain.
- What's the advantage of supporting Pydantic models here? Maybe I am just missing some reasonable use cases...
As I commented in the PR, if I remember correctly, one of the initial requirements was to enable the deserialization of Tools from YAML, which would be feasible if Tools are treated as components. Is it possible? If yes, can we add some tests to cover this?
Given the main goal of this PR, would it make sense to involve someone from the DC team in the review?
```python
msg = (
    "Component has been added in a Pipeline and can't be used to create a Tool. "
    "Create Tool from a non-pipeline component instead."
)
```
Can you please explain this?
If I remember correctly, one of the requirements was about deserializing Tools from YAML (which should be feasible if Tools are components). I'm not totally sure...
Yes, I thought we could have a component declared but not be part of a pipeline. Maybe not; depending on that, we can remove this check.
I still don't understand whether this is a self-imposed limitation (I don't think so) or whether there are strong reasons to avoid it. Could you please explain this point further?
Pydantic's TypeAdapter takes care of the full range of JSON-to-object conversion, whether the targets are strings, other native Python types, dataclasses, or Pydantic models. It would in fact take more code to implement a version of Tool.from_component that supports only strings, native Python types, and lists of them, since we would need to detect the unsupported cases, raise errors, and so on. TypeAdapter.validate_python handles all of these conversions. The unit and integration tests I included for dataclasses and Pydantic models show that this works with almost no code; we even support our own Document class, as the test examples show. Given these benefits of TypeAdapter, which enables proper conversion support with about ten lines of well-tested Pydantic code, I went with that solution even though the requirements did not call for it.
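For illustration, a minimal sketch of that conversion, using a made-up dataclass rather than code from this PR:

```python
from dataclasses import dataclass

from pydantic import TypeAdapter


@dataclass
class GreetingInput:  # hypothetical input type, not part of the PR
    name: str
    repeat: int = 1


adapter = TypeAdapter(GreetingInput)
# A JSON-style payload (e.g. LLM tool-call arguments) becomes a typed object,
# with no per-type conversion code on our side.
obj = adapter.validate_python({"name": "world", "repeat": 2})
assert obj == GreetingInput(name="world", repeat=2)
```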
Yes, I need to review this part as well.
Yes, @mathislucka is on PTO and he had the most context. Let's wait for him.
I like the simplification you have made.
I left other comments and asked Julian to take a look as well.
```python
from .component_schema import create_tool_parameters_schema

__all__ = ["create_tool_parameters_schema"]
```
- I would not export this function here if possible; see haystack#8650 ("Importing one component of a certain family/module leads to importing all components of the same family/module").
- I would prefer to make this method internal, along with all the others in component_schema.py. They should not be user-facing, and if we make them internal we are free to change them at any time if needed.
Ok makes sense, will do 🙏
The Tool.from_component functionality is great! I have some feedback about the tests, and I suggest we add defaults for the tool name and description so that from_component requires only a component as input in the standard case.
For the tests:
I suggest replacing gpt-4o with gpt-4o-mini if that works. Also consider replacing claude-3-5-sonnet-20240620 with a simpler model, if that works.
I understand that long instructions like "Concatenate these documents: First one says 'Hello world' and second one says 'Goodbye world'. Set only content field of the document only. Do not set id, meta, score, embedding, sparse_embedding, dataframe, blob fields." are meant to exercise the implementation. Still, I suggest we also use Document instances as input directly. For example, have one e2e test with a retriever as a tool-from-component, and a second tool built from a component that takes documents as input.
As the tests for OpenAI and Anthropic are very similar, let's parametrize them, which will reduce the line count. You might then need pytest.skip() within the test to skip when the API keys are not available.
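A rough sketch of the suggested parametrization (the test name and body here are placeholders, not the PR's actual tests):

```python
import os

import pytest


@pytest.mark.parametrize(
    "env_var, model",
    [
        ("OPENAI_API_KEY", "gpt-4o-mini"),
        ("ANTHROPIC_API_KEY", "claude-3-5-sonnet-20240620"),
    ],
)
def test_component_as_tool(env_var, model):
    # Skip instead of failing when the provider's key is not configured
    if not os.environ.get(env_var):
        pytest.skip(f"{env_var} not set")
    # ... shared test body that builds the Tool and queries `model` ...
```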
Last but not least, we should have a test case that covers loading tools from components defined in a pipeline YAML. That should be a YAML where a ToolInvoker is defined and its tools init parameter is a list of components. The ToolInvoker would then need to call from_component internally, as we discussed offline.
```
Create a Tool instance from a Haystack component.

:param component: The Haystack component to be converted into a Tool.
:param name: Name for the tool.
```
I don't want to make this PR more complex, but have we considered using defaults for name and description? For example, the component name as the default for name, and parts of the component docstring, extracted via docstring_parser, as the default for description (see the sketch below)? That would also enable the ToolInvoker to call from_component internally when the input to its tools parameter is a list of components instead of tools, which I think is needed for defining tools as components in a YAML.
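A hypothetical sketch of such defaults (the helper name is made up, and the docstring_parser usage is an assumption, not the PR's API):

```python
import docstring_parser


def default_tool_metadata(component) -> tuple[str, str]:
    # Fall back to the component's class name for the tool name
    name = component.__class__.__name__.lower()
    # Use the short description from the component's docstring, if present
    doc = docstring_parser.parse(component.__class__.__doc__ or "")
    description = doc.short_description or name
    return name, description
```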
Abandoned and superseded by deepset-ai/haystack#8693
Why:
Automates the conversion of Haystack components into LLM tools. Addresses haystack#8630: Tool.from_component for components with str/native Python types as input.
What:
- Added component_schema.py, which converts component run parameters into a JSON tool schema format.
- Updated tool.py to introduce a from_component method, enabling the creation of Tool instances from Haystack components. This makes tool creation and integration into pipelines more dynamic.
- Added test_tool_component.py, validating the conversion and functionality of components as tools, including tests for various data types and nested structures.
How can it be used:
Convert any Haystack component into a Tool via Tool.from_component and then use it as a Tool following the established patterns.
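For example, a hypothetical usage sketch (the import paths and exact signature are assumptions based on this PR's description):

```python
from haystack.components.builders import PromptBuilder

# Import path is an assumption for the haystack-experimental package
from haystack_experimental.dataclasses import Tool

component = PromptBuilder(template="Translate to German: {{ text }}")
tool = Tool.from_component(
    component=component,
    name="prompt_builder",
    description="Builds a German translation prompt from input text",
)
# The tool exposes a JSON schema derived from the component's run() inputs
print(tool.parameters)
```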
How did you test it:
Notes for the reviewer: