
How to pass run time values to a tool

Prerequisites

This guide assumes familiarity with the following concepts:

Supported models

This how-to guide uses models with native tool calling capability. You can find a list of all models that support tool calling.

Using with LangGraph

If you're using LangGraph, please refer to this how-to guide which shows how to create an agent that keeps track of a given user's favorite pets.

There are times when tools need to use runtime values that should not be populated by the LLM. For example, the tool logic may require using the ID of the user who made the request. In this case, allowing the LLM to control the parameter is a security risk.

Instead, the LLM should only control the parameters that are meant to be controlled by the LLM, while other parameters (such as a user ID) should be fixed by the application logic. Those application-controlled parameters should not appear in the tool's final schema at all.

This how-to guide shows some design patterns that create tools dynamically at run time and bind the appropriate values to them.

pip install -qU langchain-openai
import getpass
import os

os.environ["OPENAI_API_KEY"] = getpass.getpass()

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo-0125")

Using the curry utility function

Compatibility

This function is only available in langchain_core>=0.2.17.

We can bind arguments to the tool's inner function via a utility wrapper. This uses a technique called currying to bind arguments to the function while also removing them from the function's signature.
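
If currying is unfamiliar, here is a minimal plain-Python sketch of the idea (the function names below are made up purely for illustration): binding one argument ahead of time produces a new function with a smaller signature.

def greet(greeting: str, name: str) -> str:
    return f"{greeting}, {name}!"


def bind_greeting(greeting: str):
    """Return a version of greet with `greeting` already bound."""

    def greet_by_name(name: str) -> str:
        return greet(greeting, name)

    return greet_by_name


say_hello = bind_greeting("Hello")
say_hello("Eugene")  # -> 'Hello, Eugene!'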

Below, we initialize a tool that lists a user's favorite pet. It requires a user_id that we'll curry ahead of time.

from typing import List

from langchain_core.tools import StructuredTool
from langchain_core.utils.curry import curry

user_to_pets = {"eugene": ["cats"]}


def list_favorite_pets(user_id: str) -> List[str]:
    """List favorite pets, if any."""
    return user_to_pets.get(user_id, [])


curried_function = curry(list_favorite_pets, user_id="eugene")

curried_tool = StructuredTool.from_function(curried_function)
API Reference: StructuredTool | curry

If we examine the schema of the curried tool, we can see that it no longer has user_id as part of its signature:

curried_tool.input_schema.schema()
{'title': 'list_favorite_petsSchema',
 'description': 'List favorite pets, if any.',
 'type': 'object',
 'properties': {}}

But if we invoke it, we can see that it returns Eugene's favorite pets, cats:

curried_tool.invoke({})
['cats']
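
Because user_id is no longer part of the schema, the model never sees it. As a rough sketch (reusing the llm initialized above), the curried tool can be bound to the model like any other tool:

llm_with_curried_tool = llm.bind_tools([curried_tool])
ai_msg = llm_with_curried_tool.invoke("What are my favorite pets?")
# Any resulting tool call will have empty args, since the schema has no fields.
ai_msg.tool_calls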

Using scope

We can achieve a similar result by wrapping the tool declarations themselves in a function. This lets us take advantage of the closure created by the wrapper to pass a variable into each tool. Here's an example:

from typing import List

from langchain_core.tools import BaseTool, tool

user_to_pets = {}


def generate_tools_for_user(user_id: str) -> List[BaseTool]:
    """Generate a set of tools that have a user id associated with them."""

    @tool
    def update_favorite_pets(pets: List[str]) -> None:
        """Add the list of favorite pets."""
        user_to_pets[user_id] = pets

    @tool
    def delete_favorite_pets() -> None:
        """Delete the list of favorite pets."""
        if user_id in user_to_pets:
            del user_to_pets[user_id]

    @tool
    def list_favorite_pets() -> List[str]:
        """List favorite pets, if any."""
        return user_to_pets.get(user_id, [])

    return [update_favorite_pets, delete_favorite_pets, list_favorite_pets]
API Reference: BaseTool | tool
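
As with the curried tool, none of the generated tools expose user_id in their input schemas; a quick check (using the same input_schema inspection as before):

for generated_tool in generate_tools_for_user("eugene"):
    print(generated_tool.name, generated_tool.input_schema.schema())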

Verify that the tools work correctly:

update_pets, delete_pets, list_pets = generate_tools_for_user("eugene")
update_pets.invoke({"pets": ["cat", "dog"]})
print(user_to_pets)
print(list_pets.invoke({}))
{'eugene': ['cat', 'dog']}
['cat', 'dog']

We can now generate the per-user tools at request time and bind them to the model:

def handle_run_time_request(user_id: str, query: str):
    """Handle run time request."""
    tools = generate_tools_for_user(user_id)
    llm_with_tools = llm.bind_tools(tools)
    return llm_with_tools.invoke(query)

This code lets the LLM invoke the tools, but the LLM is unaware that a user ID even exists!

ai_message = handle_run_time_request(
    "eugene", "my favorite animals are cats and parrots."
)
ai_message.tool_calls
[{'name': 'update_favorite_pets',
  'args': {'pets': ['cats', 'parrots']},
  'id': 'call_c8agYHY1COFSAgwZR11OGCmQ'}]
info

Chat models only output requests to invoke tools; they don't actually invoke the underlying tools.

To see how to invoke the tools, please refer to how to use a model to call tools.
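
As a rough sketch (not a replacement for that guide), the requested calls could be executed against a freshly generated set of tools for the same user:

tools_by_name = {t.name: t for t in generate_tools_for_user("eugene")}

for tool_call in ai_message.tool_calls:
    selected_tool = tools_by_name[tool_call["name"]]
    print(selected_tool.invoke(tool_call["args"]))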

