With the recent introduction of `kwargs` flowing through to tools, it is now possible to have an `ai_function` that accepts extra keyword arguments, and to call the agent like so:

```python
# Set up agent that uses the search_tool
result = await agent.run("How fast is a cheetah?", deterministic_filter="animals")
```
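The original tool definition did not survive in this capture; below is a minimal sketch of what such a tool might look like, modeled as a plain async function. The `ai_function` decorator and agent wiring are omitted, and the framework is assumed to forward extra keyword arguments from `agent.run()` into `**kwargs`:

```python
import asyncio
from typing import Any

# Hypothetical stand-in for the decorated tool: the LLM only sees (and fills
# in) the `query` parameter, while extra run kwargs arrive via **kwargs.
async def search_tool(query: str, **kwargs: Any) -> str:
    # The deterministic filter has to be dug out of kwargs and cast/validated
    # manually -- one of the downsides of this approach.
    filter_value = str(kwargs.get("deterministic_filter", ""))
    return f"results for {query!r} (filter={filter_value!r})"

print(asyncio.run(search_tool("How fast is a cheetah?", deterministic_filter="animals")))
```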
With this, the agent's LLM will see that `search_tool` expects a `query` parameter of type `str`, and it will populate that parameter. The `deterministic_filter` will propagate through to `kwargs`, ensuring the filter is always the same and not subject to the whims of the LLM.
This `kwargs` approach is nice, but has downsides. For one, additional work is needed to extract the value from `kwargs` before it can be used (and potentially to cast it to the desired type). Without looking at the implementation, it's unclear which `kwargs` are applicable for `search_tool`, or what type they should be.
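The post then offered an alternative whose code block is also missing from this capture; presumably it declared the filter as an explicit, typed parameter with a default, so the type and applicability are visible in the signature. A hypothetical sketch:

```python
import asyncio

# Hypothetical alternative: deterministic_filter is a named, typed parameter,
# so no kwargs-digging or casting is needed -- but without extra schema
# machinery the LLM now sees this parameter too and may try to fill it in.
async def search_tool(query: str, deterministic_filter: str = "animals") -> str:
    return f"results for {query!r} (filter={deterministic_filter!r})"

print(asyncio.run(search_tool("How fast is a cheetah?")))
```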
In my testing the end result is the same, because the value from `kwargs` takes priority over the LLM's generated value. However, this isn't ideal: the LLM potentially wastes tokens on parameters it shouldn't worry about, and it's ambiguous whether the parameter's value is deterministic or not.
My idea would be to leverage some of Pydantic's capabilities (maybe combining `SkipJsonSchema` with `Annotated`) to mark a parameter as ignored for the purposes of the tool's representation to the LLM.
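For illustration, Pydantic's `SkipJsonSchema` already achieves this at the schema level. The sketch below uses a plain `BaseModel` standing in for the tool's parameters (the model name is made up); how a given agent framework would pick this annotation up when building the tool schema is the open question:

```python
from pydantic import BaseModel
from pydantic.json_schema import SkipJsonSchema

class SearchToolArgs(BaseModel):
    query: str
    # SkipJsonSchema excludes the field from the generated JSON schema,
    # so an LLM fed this schema would never see deterministic_filter.
    deterministic_filter: SkipJsonSchema[str] = "animals"

schema = SearchToolArgs.model_json_schema()
print(schema["properties"])  # only "query" remains visible to the LLM
```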
I thought SemanticKernel had something similar, but the closest thing I could find with some quick searching was this, which doesn't seem to have been implemented.
Edit: In my testing, the second example actually uses the LLM's generated value if you omit `**kwargs: Any` from the function definition. I'm not sure why that would be the case.