Mastering LangChain Callbacks: Track LLM Execution Step‑by‑Step

LangChain’s callback system lets developers hook into every stage of an LLM chain, from chain start and end to individual token generation, using built‑in handlers like StdOutCallbackHandler or custom handlers derived from BaseCallbackHandler. This article covers constructor‑level and request‑level attachment, plus a custom handler implementation.

BirdNest Tech Talk

What Are Callbacks?

In LangChain, a callback is a function you implement that runs when specific events occur during the lifecycle of a chain, LLM call, tool, or retriever. The framework defines events such as on_chain_start, on_chain_end, on_llm_start, on_llm_end, on_tool_start, on_tool_end, on_retriever_start, on_retriever_end, on_llm_new_token, and error events like on_chain_error or on_llm_error.

CallbackHandler

To receive these events you create one or more callback handlers. Each handler is a class that implements the methods corresponding to the events you care about. LangChain ships with built‑in handlers, including:

StdOutCallbackHandler – prints every event’s details to the console; the quickest way to debug a chain.

FileCallbackHandler – writes events to a file.

How to Attach Callbacks

You can attach handlers at two levels:

Constructor‑level: pass a list of handlers via the callbacks argument when you instantiate a model or chain:

llm = ChatOpenAI(callbacks=[MyCustomHandler()])

Request‑level: provide a config dictionary with a callbacks key when calling .invoke(), .stream(), and similar methods. Request‑level handlers apply to that call and its child runs, alongside any constructor‑level handlers, so different requests can use different logic.

chain.invoke({"input": "..."}, config={"callbacks": [MyCustomHandler()]})

This is the most flexible approach.
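The two attachment levels can be pictured with a small stand-in dispatcher. This is not LangChain’s implementation, just a self-contained sketch of the idea: constructor-level handlers travel with the object, request-level handlers arrive per call via config, and both receive the events for that run.

```python
# Simplified stand-in for the two callback attachment levels.
# NOT LangChain internals -- a plain-Python illustration only.

class Handler:
    """Records every event it receives, like a minimal callback handler."""
    def __init__(self, name):
        self.name = name
        self.events = []

    def on_event(self, event, payload):
        self.events.append((event, payload))

class Chain:
    def __init__(self, callbacks=None):
        self.constructor_callbacks = callbacks or []  # constructor-level

    def invoke(self, inputs, config=None):
        # Request-level handlers come from config and apply to this call only
        request_callbacks = (config or {}).get("callbacks", [])
        handlers = self.constructor_callbacks + request_callbacks
        for h in handlers:
            h.on_event("chain_start", inputs)
        output = inputs["input"].upper()  # pretend "work"
        for h in handlers:
            h.on_event("chain_end", output)
        return output

ctor = Handler("ctor")
req = Handler("req")
chain = Chain(callbacks=[ctor])
chain.invoke({"input": "hi"}, config={"callbacks": [req]})
print([e for e, _ in ctor.events])  # ['chain_start', 'chain_end']
print([e for e, _ in req.events])   # ['chain_start', 'chain_end']
```

Note that both handlers see both events for this run; the request-level handler simply disappears on the next call unless it is passed again.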

Creating a Custom Callback Handler

For advanced use cases you subclass BaseCallbackHandler and implement the event methods you need. Below is a minimal example that logs the start and end of an LLM call:

# Modern import path; older releases exposed this as
# langchain.callbacks.base.BaseCallbackHandler
from langchain_core.callbacks import BaseCallbackHandler

class MyCustomHandler(BaseCallbackHandler):
    def on_llm_start(self, serialized, prompts, **kwargs):
        # Fired once per LLM call, before any output is generated
        print(f"LLM started, prompts: {prompts}")

    def on_llm_end(self, response, **kwargs):
        # Fired when the LLM call completes
        print(f"LLM finished, output: {response}")

Example Scenarios

Example 1: Using StdOutCallbackHandler to Debug a Chain

This script builds a simple conversation chain (prompt template → ChatOpenAI → string output parser) and attaches StdOutCallbackHandler via the config argument so that every event (on_chain_start, on_llm_start, on_llm_end, on_chain_end) is printed to the console. It demonstrates how callbacks reveal the internal flow and help troubleshoot complex chains.
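A sketch of such a script follows. It assumes the langchain-core and langchain-openai packages and an OPENAI_API_KEY environment variable; the model name "gpt-4o-mini" and the prompt text are illustrative, and the guard lets the file run (with a notice) even when those prerequisites are missing.

```python
# Sketch of Example 1: StdOutCallbackHandler attached at request level.
# Assumes langchain-core + langchain-openai and OPENAI_API_KEY.
import os

try:
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.callbacks import StdOutCallbackHandler
    from langchain_openai import ChatOpenAI
    deps_available = True
except ImportError:
    deps_available = False

def run_example():
    """Build prompt -> model -> parser and attach StdOutCallbackHandler
    via config, so chain and LLM events are printed to the console."""
    prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
    chain = prompt | ChatOpenAI(model="gpt-4o-mini") | StrOutputParser()
    return chain.invoke(
        {"question": "What is a callback?"},
        config={"callbacks": [StdOutCallbackHandler()]},
    )

if deps_available and os.environ.get("OPENAI_API_KEY"):
    print(run_example())
else:
    print("Install langchain-openai and set OPENAI_API_KEY to run this sketch.")
```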

Example 2: Creating a Custom Callback Handler

The next example shows how to collect specific metrics such as cost and latency by implementing a custom handler that records timestamps and token usage inside the overridden event methods.
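A minimal sketch of such a metrics handler is below. The stub fallback for BaseCallbackHandler is only there so the snippet runs without LangChain installed, and the event sequence at the bottom is simulated by hand rather than produced by a real model call; with a real streaming LLM, LangChain would invoke these same methods for you.

```python
# Sketch of a custom handler that records latency and a token count.
# The simulated event sequence at the bottom stands in for a real LLM call.
import time

try:  # real base class if LangChain is installed
    from langchain_core.callbacks import BaseCallbackHandler
except ImportError:  # stub so the sketch runs standalone
    class BaseCallbackHandler:
        pass

class MetricsHandler(BaseCallbackHandler):
    """Records latency per LLM call and a running token count."""
    def __init__(self):
        self.latencies = []
        self.total_tokens = 0
        self._start = None

    def on_llm_start(self, serialized, prompts, **kwargs):
        self._start = time.perf_counter()  # timestamp the call start

    def on_llm_new_token(self, token, **kwargs):
        self.total_tokens += 1  # streamed tokens arrive one at a time

    def on_llm_end(self, response, **kwargs):
        self.latencies.append(time.perf_counter() - self._start)

# Simulate the event sequence a streaming LLM call would produce:
h = MetricsHandler()
h.on_llm_start({}, ["What is a callback?"])
for tok in ["A", " callback", " is", " a", " hook", "."]:
    h.on_llm_new_token(tok)
h.on_llm_end(response=None)

print(h.total_tokens)    # 6
print(len(h.latencies))  # 1
```

From the recorded token counts and latencies, cost can be estimated offline by multiplying by the provider’s per-token price.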

Reference Materials

How to: pass in callbacks at runtime [1] – https://python.langchain.com/docs/how_to/callbacks_runtime

How to: attach callbacks to a module [2] – https://python.langchain.com/docs/how_to/callbacks_module

How to: pass callbacks into a module constructor [3] – https://python.langchain.com/docs/how_to/callbacks_constructor

How to: create custom callback handlers [4] – https://python.langchain.com/docs/how_to/custom_callbacks

How to: use callbacks in async environments [5] – https://python.langchain.com/docs/how_to/callbacks_async

How to: dispatch custom callback events [6] – https://python.langchain.com/docs/how_to/callbacks_custom_events

Written by BirdNest Tech Talk, author of the rpcx microservice framework, original book author, and chair of Baidu's Go CMC committee.