Ch 4 — Tools & Function Calling

How LLMs invoke external functions, APIs, and databases — the bridge between reasoning and action
A. Why Tools? LLMs can think — tools let them act

An LLM alone can reason, but it cannot act on the world. Give it tools and it can search, call APIs, query databases, and run code. The combination is an agent: a model that reasons about which tool to use, and when.
B. The Function Calling Flow: LLM decides → you execute → LLM sees result

1. User message: "What's the weather in Tokyo?" goes to the model together with the available tool definitions.
2. LLM decides: instead of answering directly, it returns a tool call, get_weather("Tokyo").
3. You run it: your code calls the actual weather API. The model never executes anything itself; it only emits a structured request.
4. LLM responds: with the tool result back in the conversation, it answers "It's 22°C and sunny in Tokyo."
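The four steps above can be sketched as a runnable loop. Everything here is a stand-in: the fake_llm function hard-codes the model's two turns so the flow runs without an API key, and the message dicts are illustrative rather than any specific provider's schema.

```python
def get_weather(city: str) -> str:
    """Pretend weather API (a real app would call an HTTP endpoint)."""
    return f"22°C and sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_llm(messages):
    """Stand-in for a chat model: first turn emits a tool call, second answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"role": "assistant", "content": None,
                "tool_calls": [{"id": "call_1", "name": "get_weather",
                                "args": {"city": "Tokyo"}}]}
    result = next(m["content"] for m in messages if m["role"] == "tool")
    return {"role": "assistant", "content": f"It's {result}."}

messages = [{"role": "user", "content": "What's the weather in Tokyo?"}]
reply = fake_llm(messages)

while reply.get("tool_calls"):            # step 2: the LLM decided to call a tool
    messages.append(reply)
    for call in reply["tool_calls"]:      # step 3: you execute it
        output = TOOLS[call["name"]](**call["args"])
        messages.append({"role": "tool", "tool_call_id": call["id"],
                         "content": output})
    reply = fake_llm(messages)            # step 4: the LLM sees the result

print(reply["content"])                   # It's 22°C and sunny in Tokyo.
```

The loop shape is the part that carries over to real providers: keep calling the model, and as long as its reply contains tool calls, execute them, append the results, and call again.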
C. Defining Tools: the @tool decorator turns any function into a tool

Start with a plain Python function, e.g. def get_weather(city: str) -> str. Applying LangChain's @tool decorator wraps it in a tool object carrying a name, a description (taken from the docstring), and an args schema (derived from the type hints). Finally, model.bind_tools([get_weather]) attaches the tool to the model so it knows what it may call.
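To make concrete what the decorator extracts, here is a simplified, hypothetical tool decorator in plain Python; it is not LangChain's implementation, but it derives the same three pieces of metadata (name, description, args schema) from the function itself:

```python
import inspect

def tool(fn):
    """Illustrative sketch of a @tool-style decorator: attach a name,
    description, and args schema derived from the function itself."""
    sig = inspect.signature(fn)
    fn.tool_name = fn.__name__                       # name from the function
    fn.tool_description = (fn.__doc__ or "").strip() # description from docstring
    fn.args_schema = {                               # schema from type hints
        name: param.annotation.__name__
        for name, param in sig.parameters.items()
    }
    return fn

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"(weather for {city})"

print(get_weather.tool_name)    # get_weather
print(get_weather.args_schema)  # {'city': 'str'}
```

This is why docstrings and type hints matter so much when defining tools: they are not documentation for humans only, they become the schema the model reads when deciding whether and how to call the tool.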
D. The Tool Message Protocol: how tool calls and results flow as messages

When the model wants a tool, its reply is an AIMessage with tool_calls = [{name, args, id}]. You run the function (get_weather("Tokyo") → "22°C, sunny") and wrap the result in a ToolMessage with content="22°C, sunny" and a tool_call_id matching the id from the AIMessage. The matching id is what lets the model pair each result with the call that requested it.
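A minimal sketch of that pairing, using plain dataclasses in place of LangChain's actual AIMessage and ToolMessage classes (the field names mirror theirs, but this is a model of the protocol, not the real library):

```python
from dataclasses import dataclass, field

@dataclass
class AIMessage:
    content: str
    tool_calls: list = field(default_factory=list)  # [{"name", "args", "id"}]

@dataclass
class ToolMessage:
    content: str
    tool_call_id: str  # must match the id of the call it answers

def get_weather(city: str) -> str:
    return "22°C, sunny"

ai = AIMessage(content="", tool_calls=[
    {"name": "get_weather", "args": {"city": "Tokyo"}, "id": "call_abc123"},
])

# Run each requested function and wrap its output in a ToolMessage
# whose tool_call_id echoes the originating call's id.
results = [
    ToolMessage(content=get_weather(**c["args"]), tool_call_id=c["id"])
    for c in ai.tool_calls
]

print(results[0].content)       # 22°C, sunny
print(results[0].tool_call_id)  # call_abc123
```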
E. Multiple Tools & Parallel Calls: LLMs can call several tools at once

An agent can be given several tools at once: get_weather (a weather API for any city), web_search (search the web for information), and calculator (math operations and conversions). The model picks whichever fits the request, and many models support parallel tool calling: a single turn can contain two or more tool calls, each of which you execute and answer with its own ToolMessage.
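Handling a parallel-call turn is the same loop as before, just over a list. A sketch with two of the tools named above (message shapes illustrative, not a specific provider's API; the eval-based calculator is for demonstration only):

```python
def get_weather(city: str) -> str:
    return f"22°C in {city}"

def calculator(expression: str) -> str:
    # Demo only: restricted eval for simple arithmetic, never for untrusted input.
    return str(eval(expression, {"__builtins__": {}}))

TOOLS = {"get_weather": get_weather, "calculator": calculator}

# One assistant turn carrying two tool calls at once:
tool_calls = [
    {"id": "call_1", "name": "get_weather", "args": {"city": "Tokyo"}},
    {"id": "call_2", "name": "calculator",
     "args": {"expression": "22 * 9 / 5 + 32"}},
]

# Execute every call and answer each with its own ToolMessage-shaped dict.
tool_messages = [
    {"role": "tool", "tool_call_id": c["id"],
     "content": TOOLS[c["name"]](**c["args"])}
    for c in tool_calls
]

for m in tool_messages:
    print(m["tool_call_id"], "->", m["content"])
```

Because every result carries its tool_call_id, the order you return them in does not matter; the ids do the matching.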
F. MCP — Model Context Protocol: one standard, any tool

MCP is the USB-C of AI tools: one standard connector between agents and tools. Any agent (LangChain, CrewAI, or any other framework) speaks MCP to an MCP server, and the server exposes a standardized tool interface over any underlying resource: files, databases, APIs, GitHub, Slack, and more. Implement a tool once as an MCP server and every MCP-capable agent can use it.
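On the wire, MCP is JSON-RPC: a client first asks the server which tools it exposes, then invokes one by name. A rough sketch of the two request shapes (the method names follow the MCP specification as I understand it; consult the spec for the authoritative schema):

```python
import json

# Ask the server what tools it exposes...
list_request = {
    "jsonrpc": "2.0", "id": 1,
    "method": "tools/list",
}

# ...then invoke one by name with JSON arguments.
call_request = {
    "jsonrpc": "2.0", "id": 2,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Tokyo"}},
}

wire = json.dumps(call_request)
print(wire)
```

Note the symmetry with everything earlier in the chapter: a tool is still a name plus typed arguments; MCP just fixes the envelope so any client and any server agree on it.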