15 April 2025

LangChain-Tools

by Carson Kempf



Overview

Tools are implemented as Python functions

They carry a schema that defines

  • The function's name
  • A description of what the function does
  • The expected arguments

Chat models support tool calling

  • Chat models request the execution of specific functions
  • They ask for specific inputs

Creating a Tool

  • Created with the @tool decorator
  • A tool can also return artifacts

Hiding input arguments

  • Hidden from the schema
  • And therefore from the chat model
  • Done using injected tool arguments

The Tool Interface

Runnables

  • The tool interface is defined in the BaseTool class
  • BaseTool is a subclass of Runnable

Schema Attributes

  • name: the name of the tool
  • description: a description of what the tool does
  • args: a property that returns a JSON schema for the tool’s arguments

Key Methods

  • invoke: runs the tool with the given arguments
  • ainvoke: runs the tool with the given arguments, asynchronously

Creating tools with the @tool decorator

  • Wraps a Python function in the tool interface
  • Example
from langchain_core.tools import tool

@tool
def multiply(a: int, b: int) -> int:
   """Multiply two numbers."""
   return a * b

How to create custom tools (come back to this)


Using the Tool

multiply.invoke({"a": 2, "b": 3})  # returns 6

Inspecting the Tool’s Attributes

  • name, description, and args (a JSON schema for the arguments)
print(multiply.name) # multiply
print(multiply.description) # Multiply two numbers.
print(multiply.args) 
# {
# 'type': 'object', 
# 'properties': {'a': {'type': 'integer'}, 'b': {'type': 'integer'}}, 
# 'required': ['a', 'b']
# }

InjectedToolArg

  • Some arguments should be passed to a tool only at runtime
  • They should not be generated by the model itself

Inject the user_id at runtime

from typing import Annotated

from langchain_core.tools import tool, InjectedToolArg

@tool
def user_specific_tool(input_data: str, user_id: Annotated[str, InjectedToolArg]) -> str:
    """Tool that processes input data."""
    return f"User {user_id} processed {input_data}"

RunnableConfig

  • An object for passing custom runtime values to tools
  • A tool can access it by declaring a config parameter

from langchain_core.runnables import RunnableConfig
from langchain_core.tools import tool

@tool
async def some_func(..., config: RunnableConfig) -> ...:
    """Tool that does something."""
    # do something with config
    ...

await some_func.ainvoke(..., config={"configurable": {"value": "some_value"}})
  • Injected at runtime
  • Not a part of the tool’s schema

Toolkits

  • Groups tools together
  • They are designed to be used together for specific tasks
# Initialize a toolkit
toolkit = ExampleToolkit(...)

# Get list of tools
tools = toolkit.get_tools()
  • Returns a list of tools

Tool Calling


Overview

  • Used when we want a chat model to interact directly with other systems, such as APIs or databases
  • We request a model response that matches a particular schema

Key Concepts

Tool Creation

The @tool Decorator

  • Creates a tool
  • Associates a Python function with a schema the model can read

Tool Binding

  • Connecting to a model that supports tool calling
  • The model becomes aware of the tool and the input schema that the tool needs

Tool Calling

  • The model decides whether to call a tool
  • The model ensures the response fits the tool’s input schema

Tool Execution

  • The tool is then executed using arguments
  • The model provides the arguments

Workflow for tool calling:

# Tool creation
tools = [my_tool]
# Tool binding
model_with_tools = model.bind_tools(tools)
# Tool calling 
response = model_with_tools.invoke(user_input)
  1. Create a tool
  2. Pass the tool to the model with the .bind_tools() method
    • Tools are passed as a list
  3. Invoke the model as usual
    • If the model decides to call a tool, its response will contain the tool call arguments
    • Those arguments can then be passed directly to the tool

Tool Creation

from langchain_core.tools import tool

@tool
def multiply(a: int, b: int) -> int:
    """Multiply a and b."""
    return a * b

Tool Binding

DeepSeek Documentation

  • Many model providers support tool calling
  • DeepSeek supports tool calling

model_with_tools = model.bind_tools(tools_list)

def multiply(a: int, b: int) -> int:
    """Multiply a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b

llm_with_tools = tool_calling_model.bind_tools([multiply])

Tool Calling

  • A model decides when to use a tool

Example Invocation and Model Response

result = llm_with_tools.invoke("What is 2 multiplied by 3?")


result.tool_calls
# [{'name': 'multiply', 'args': {'a': 2, 'b': 3}, 'id': 'xxx', 'type': 'tool_call'}]

Calling Tools Using ToolNode (LangGraph)

  • How to add multiple tools to a model