Master LangChain Chains with LCEL: From Simple Jokes to RAG and Agent Pipelines
This guide explains how LangChain's Expression Language (LCEL) lets you declaratively connect prompts, models, and output parsers into chains. It walks through environment setup, dependency installation, and detailed code examples ranging from a basic joke generator to retrieval‑augmented generation (RAG) and memory‑enabled agents.
In LangChain, a Chain links components such as language models, prompts, and tools in sequence to accomplish a task. By chaining simple components, you can build complex applications.
LCEL: LangChain Expression Language
LCEL is LangChain's declarative way of composing chains, and it has been the recommended approach since LangChain v0.1.0. It uses the pipe symbol | to connect components, making data flow explicit and providing built‑in support for streaming, parallel execution, and logging.
A minimal chain consists of three parts:
Prompt: receives user input and formats it into a prompt the model can understand.
Model: consumes the formatted prompt and generates a response.
Output Parser: transforms the model's textual response into a usable structure (e.g., JSON or a Python object).
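The pipe operator is easiest to understand as function composition: each component transforms its input and hands the result to the next. The toy sketch below illustrates that idea in plain Python; it is not LangChain's actual implementation, and all names are illustrative.

```python
# Toy illustration of how LCEL's `|` operator chains components.
# NOT LangChain's real implementation -- just the underlying idea.

class Runnable:
    def __init__(self, func):
        self.func = func

    def invoke(self, value):
        return self.func(value)

    def __or__(self, other):
        # `a | b` builds a new Runnable that runs a, then feeds b.
        return Runnable(lambda value: other.invoke(self.invoke(value)))

# Stand-ins for a prompt template, a model, and an output parser.
prompt = Runnable(lambda d: f"Tell me a joke about {d['topic']}.")
model = Runnable(lambda p: f"MODEL RESPONSE to: {p}")
parser = Runnable(lambda r: r.strip())

chain = prompt | model | parser
print(chain.invoke({"topic": "cats"}))
# → MODEL RESPONSE to: Tell me a joke about cats.
```

In real LCEL, `prompt | model | parser` works the same way structurally, except each stage is a full LangChain Runnable with streaming and batching support.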
Running the Example Repository
The 22‑Chains/ directory contains several scripts that demonstrate LCEL chains of increasing complexity.
Environment Setup
API keys: create a .env file in 22‑Chains/ and add your keys.

OPENAI_API_KEY="sk-..."
TAVILY_API_KEY="tvly-..."

Obtain a free Tavily AI key at https://tavily.com/.
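At the top of each script, the keys are typically pulled into the process environment with python-dotenv; a minimal loading sketch (the variable name `openai_key` is illustrative):

```python
import os
from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # reads the .env file in the current working directory
openai_key = os.environ["OPENAI_API_KEY"]  # raises KeyError if the key is missing
```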
Install dependencies : run the following command once to install all required packages.
pip install langchain langchain-openai langchain-community python-dotenv faiss-cpu tiktoken tavily-python

Execute an example: run any script from the terminal.
python example_1_simple_lcel_chain.py
python example_2_rag_chain.py
python example_3_chain_with_history.py
python example_1_basic_agent.py
python example_2_agent_with_memory.py

Example File Walk‑through
example_1_simple_lcel_chain.py: builds the simplest LCEL chain (prompt → model → output parser) to generate a joke.
example_2_rag_chain.py: constructs a Retrieval‑Augmented Generation (RAG) chain. It first queries a vector store for relevant documents, then feeds those documents to the model so the answer is grounded in retrieved context.
example_3_chain_with_history.py: demonstrates a conversational chain with manual memory management. The script appends each turn to a chat history list, allowing the model to retain context across multiple exchanges.
example_1_basic_agent.py and example_2_agent_with_memory.py: introduce agents, a dynamic chain type. An agent uses a large language model to decide which tool (e.g., a web search via Tavily) to invoke. The second script adds a memory component so the agent can recall prior interactions.
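The manual memory pattern from example_3_chain_with_history.py can be sketched without any API calls. In this toy version the model is a stub (in the real script it would be an LCEL chain invocation); the point is how each turn is appended to a history list, so later calls see the full conversation. All names here are illustrative, not taken from the script.

```python
# Sketch of manual chat-history management; the model call is a stub.

def fake_model(messages):
    # Stand-in for chain.invoke(...); reports how much context it received.
    return f"(reply after {len(messages)} messages)"

history = []  # list of (role, content) tuples, grows with every turn

def chat(user_input):
    history.append(("human", user_input))
    reply = fake_model(history)
    history.append(("ai", reply))
    return reply

chat("Hi, I'm Alice.")
answer = chat("What is my name?")
# The second call sees the first exchange plus the new question,
# which is exactly what lets a real model "remember" earlier turns.
```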
Note: These examples illustrate the full spectrum of LangChain chaining—from basic LCEL composition to advanced RAG and agent patterns.
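The core of an agent (the model deciding which tool to invoke) can also be caricatured in a few lines. Here a keyword check stands in for the LLM's decision step, and the tool names and routing rule are purely illustrative; a real agent would let the model choose and could loop over several tool calls.

```python
# Toy agent loop: a stub decision function picks a tool, the tool runs,
# and its result is returned as the answer.

def web_search(query):
    # Stand-in for a real search tool such as Tavily.
    return f"[search results for: {query}]"

def calculator(expression):
    return str(eval(expression))  # fine for a toy; never eval untrusted input

TOOLS = {"web_search": web_search, "calculator": calculator}

def decide_tool(query):
    # Stand-in for the LLM's tool-selection step.
    return "calculator" if any(c.isdigit() for c in query) else "web_search"

def run_agent(query):
    tool_name = decide_tool(query)
    return TOOLS[tool_name](query)

run_agent("2+3")             # routed to calculator → "5"
run_agent("latest Go news")  # routed to web_search
```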
Reference
[1] Tavily AI: https://tavily.com/
BirdNest Tech Talk
Author of the rpcx microservice framework, original book author, and chair of Baidu's Go CMC committee.