How Function Calling Empowers LLMs: A Step‑by‑Step LangChain Guide

This article explains how function (tool) calling lets large language models like GPT or Gemini invoke external APIs, walks through defining tools with LangChain, and demonstrates a complete Python example that fetches real‑time weather data and returns a natural‑language answer.

BirdNest Tech Talk

Function calling (also called tool calling) enables large language models such as GPT or Gemini to invoke external functions, turning pure text generation into interactive agents that can access real‑time information.

What is function calling?

Traditional LLMs only process text: they cannot know the current date, query a database, or book a flight. By providing a list of tools, each with a name, description, and typed parameters, you let the model decide to call a tool whenever doing so would improve its answer.
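Concretely, the tool list handed to the model is a set of JSON-schema descriptions. A minimal sketch of what one entry might look like, using the common OpenAI-style field names rather than any specific provider's exact wire format:

```python
# Hypothetical tool description in the common JSON-schema style.
# The model never executes this dict; it only reads the schema to
# decide whether and how to request a call to the tool.
get_weather_spec = {
    "name": "get_weather",
    "description": "Look up the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "City name, e.g. 'Shanghai'",
            }
        },
        "required": ["location"],
    },
}
```

The description fields matter: the model relies on them to choose the right tool and fill in arguments correctly.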

Why it matters

Function calling gives the model "actionability": it can fetch live weather or stock prices, query internal knowledge bases, or return structured JSON output, making it suitable for enterprise workflows and reducing reliance on brittle prompt engineering.

Function calling with LangChain

LangChain supplies utilities to declare tools (using Pydantic models or the @tool decorator), bind them to a chat model via .bind_tools() (or the legacy .bind_functions()), and automatically parse the tool_calls field on the model's response.

Step‑by‑step example: weather tool

Define a Pydantic schema GetWeatherArgs with a location field and decorate the implementation get_weather with @tool(args_schema=GetWeatherArgs).
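LangChain's @tool decorator wraps an ordinary function, so the interesting part is the function body itself. The stub below sketches what the implementation behind get_weather might look like, with a hard-coded lookup table standing in for a real weather API (the cities and values are invented for illustration):

```python
import json

# Stand-in for a real weather API; data is hard-coded for illustration.
_FAKE_WEATHER = {
    "上海": {"temperature": "28°C", "condition": "多云"},  # Shanghai: cloudy
}

def get_weather(location: str) -> str:
    """Return the current weather for a location as a JSON string."""
    data = _FAKE_WEATHER.get(
        location, {"temperature": "unknown", "condition": "unknown"}
    )
    # ensure_ascii=False keeps non-ASCII city names readable in the output.
    return json.dumps({"location": location, **data}, ensure_ascii=False)
```

In the real script this body would call an actual weather service; the JSON-string return type is what later gets wrapped in a ToolMessage.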

Bind the tool to a ChatOpenAI model: model_with_tools = model.bind_tools([get_weather]).

Send a user message "What's the weather in Shanghai?"; the model returns an AIMessage containing a tool_calls entry with name get_weather and argument {"location":"上海"} ("Shanghai").
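The parsed form of one tool_calls entry looks roughly like the dictionary below; the exact object layout varies by LangChain version, so treat this as an illustrative shape rather than the definitive API:

```python
# Illustrative shape of one parsed tool call: each entry carries the
# tool's name, its already-parsed arguments, and a call identifier.
tool_call = {
    "name": "get_weather",
    "args": {"location": "上海"},
    "id": "call_abc123",  # invented identifier, for illustration only
}

# The application routes on the name and forwards the parsed args:
tool_name = tool_call["name"]
city = tool_call["args"]["location"]
```

Because the arguments arrive already parsed into a dict, the application does not need to parse raw JSON itself before dispatching.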

The application extracts the call and invokes get_weather, which returns a JSON string such as {"location":"上海","temperature":"28°C","condition":"多云"} (Shanghai: 28°C, cloudy), and wraps the result in a ToolMessage that is appended to the conversation history.

The updated message list is sent back to the model, which now has the real weather data and generates a natural‑language answer.

The script example_2_tool_chain.py implements these steps: it handles missing API keys, prints the intermediate messages, and finally prints the model's answer, illustrating that the "model → tool → model" loop is the core of building agents that can interact with the external world.
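That loop can be sketched without any API key by stubbing the model: the first call returns a tool request, and once a tool result is in the history, the second call returns the final answer. The message dicts and fake_model below are simplifications of LangChain's message classes, invented for illustration:

```python
import json

def fake_weather_tool(location: str) -> str:
    # Stand-in for the real get_weather tool; data is hard-coded.
    return json.dumps(
        {"location": location, "temperature": "28°C", "condition": "cloudy"}
    )

def fake_model(messages):
    # Stubbed chat model: requests the tool once, then answers in text.
    if not any(m["role"] == "tool" for m in messages):
        return {"role": "ai", "content": "", "tool_calls": [
            {"name": "get_weather", "args": {"location": "Shanghai"}}]}
    weather = json.loads(messages[-1]["content"])
    answer = (f"It is {weather['temperature']} and {weather['condition']} "
              f"in {weather['location']}.")
    return {"role": "ai", "content": answer, "tool_calls": []}

def run(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    reply = fake_model(messages)
    while reply["tool_calls"]:                 # model asked for a tool
        messages.append(reply)
        for call in reply["tool_calls"]:
            result = fake_weather_tool(**call["args"])
            messages.append({"role": "tool", "content": result})
        reply = fake_model(messages)           # send tool results back
    return reply["content"]
```

Swapping fake_model for a real bound model and fake_weather_tool for the decorated tool yields the structure the script follows.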


Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Written by BirdNest Tech Talk

Author of the rpcx microservice framework, original book author, and chair of Baidu's Go CMC committee.