Prebuilt¶
ToolNode¶
Bases: RunnableCallable
A node that runs the tools requested in the last AIMessage. It can be used either in StateGraph with a "messages" key or in MessageGraph. If multiple tool calls are requested, they will be run in parallel. The output will be a list of ToolMessages, one for each tool call.
Source code in langgraph/prebuilt/tool_node.py
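The behavior described above can be sketched in plain Python. This is a minimal illustration only, not ToolNode's actual implementation; the dict shapes for messages and tool calls are assumptions that mirror LangChain's tool-call format:

```python
# Minimal sketch of what a ToolNode-style node does: read the tool calls
# from the last AI message and produce one tool result per call.
# Illustration only -- the real ToolNode works on LangChain message objects.

def run_tool_node(messages, tools_by_name):
    last = messages[-1]  # assumed: an AI message dict with a "tool_calls" list
    results = []
    for call in last["tool_calls"]:
        tool = tools_by_name[call["name"]]
        output = tool(**call["args"])
        # One "tool message" per tool call, tagged with the originating call id
        results.append(
            {"role": "tool", "content": str(output), "tool_call_id": call["id"]}
        )
    return results

# Toy usage with a hypothetical "add" tool
tools = {"add": lambda a, b: a + b}
msgs = [{"role": "ai", "tool_calls": [{"name": "add", "args": {"a": 2, "b": 3}, "id": "1"}]}]
print(run_tool_node(msgs, tools))
# [{'role': 'tool', 'content': '5', 'tool_call_id': '1'}]
```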
ToolExecutor¶
Bases: RunnableCallable
Executes a tool invocation.
Parameters:

- `tools` (Sequence[BaseTool]) – A sequence of tools that can be invoked.
- `invalid_tool_msg_template` (str, default: `INVALID_TOOL_MSG_TEMPLATE`) – The template for the error message when an invalid tool is requested.
Examples:

```python
from langchain_core.tools import tool
from langgraph.prebuilt.tool_executor import ToolExecutor, ToolInvocation

@tool
def search(query: str) -> str:
    """Search engine."""
    return f"Searching for: {query}"

tools = [search]
executor = ToolExecutor(tools)

invocation = ToolInvocation(tool="search", tool_input="What is the capital of France?")
result = executor.invoke(invocation)
print(result)  # Output: "Searching for: What is the capital of France?"

invocation = ToolInvocation(
    tool="nonexistent", tool_input="What is the capital of France?"
)
result = executor.invoke(invocation)
print(result)  # Output: "nonexistent is not a valid tool, try one of [search]."
```
Source code in langgraph/prebuilt/tool_executor.py
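The invalid-tool fallback shown in the example above can be sketched in plain Python. This is an illustration under stated assumptions, not ToolExecutor's real code; the helper names here are hypothetical, and only the message template's wording is taken from the example output:

```python
# Sketch of the invalid-tool fallback ToolExecutor performs: look up the
# requested tool by name and, if it is unknown, format an error message
# from a template instead of raising. Illustration only.

INVALID_TOOL_MSG_TEMPLATE = "{requested} is not a valid tool, try one of [{available}]."

def execute(tool_name, tool_input, tools):
    by_name = {t.__name__: t for t in tools}
    if tool_name not in by_name:
        return INVALID_TOOL_MSG_TEMPLATE.format(
            requested=tool_name, available=", ".join(by_name)
        )
    return by_name[tool_name](tool_input)

def search(query):
    return f"Searching for: {query}"

print(execute("search", "capital of France", [search]))
# Searching for: capital of France
print(execute("nonexistent", "capital of France", [search]))
# nonexistent is not a valid tool, try one of [search].
```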
ToolInvocation¶
Bases: Serializable
Information about how to invoke a tool.
Attributes:

- `tool` (str) – The name of the Tool to execute.
- `tool_input` (Union[str, dict]) – The input to pass in to the Tool.
Examples:

```python
invocation = ToolInvocation(
    tool="search",
    tool_input="What is the capital of France?"
)
```
Source code in langgraph/prebuilt/tool_executor.py
chat_agent_executor.create_tool_calling_executor¶
tools_condition¶
Use in the conditional_edge to route to the ToolNode if the last message has tool calls. Otherwise, route to the end.
Parameters:

- `state` (Union[list[AnyMessage], dict[str, Any]]) – The state to check for tool calls. Must be a list of messages (MessageGraph) or have a "messages" key (StateGraph).
Returns:

- Literal["action", "__end__"] – The next node to route to.
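The routing decision itself is simple enough to sketch in plain Python. This is an illustration only, assuming messages are dicts with an optional "tool_calls" list; the real function inspects LangChain message objects:

```python
# Plain-Python sketch of the tools_condition decision: route to the tool
# node when the last message carries tool calls, otherwise end the graph.

def tools_condition_sketch(messages):
    last = messages[-1]
    if last.get("tool_calls"):
        return "action"   # route to the ToolNode
    return "__end__"      # no tool calls: finish

print(tools_condition_sketch([{"content": "hi"}]))                    # __end__
print(tools_condition_sketch([{"tool_calls": [{"name": "divide"}]}]))  # action
```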
Examples:

```python
from langchain_anthropic import ChatAnthropic
from langchain_core.tools import tool
from langgraph.graph import MessageGraph
from langgraph.prebuilt import ToolNode, tools_condition

@tool
def divide(a: float, b: float) -> float:
    """Return a / b."""
    return a / b

llm = ChatAnthropic(model="claude-3-haiku-20240307")
tools = [divide]

graph_builder = MessageGraph()
graph_builder.add_node("tools", ToolNode(tools))
graph_builder.add_node("chatbot", llm.bind_tools(tools))
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_conditional_edges(
    "chatbot",
    # highlight-next-line
    tools_condition,
    {
        # If tools_condition returns "action", route to the "tools" node
        "action": "tools",
        # If it returns "__end__", route to the end
        "__end__": "__end__",
    },
)
graph_builder.set_entry_point("chatbot")
graph = graph_builder.compile()
graph.invoke([("user", "What's 329993 divided by 13662?")])
```