Introduction to LangChain: Concepts, Tools, and Example Applications
This article introduces the LangChain framework and its core concepts (models, prompts, agents, memory, indexes, chains, and tools), walks through code examples for each component, and demonstrates practical applications ranging from chatbots to image generation, helping readers understand and build LLM-powered solutions.
LangChain is a framework that integrates large language models (LLMs) with external computation and knowledge sources to build more capable AI applications.
Key Concepts
Models: standard interfaces for text generation and chat models.
Prompts: PromptTemplate and ChatPromptTemplate for reusable prompt construction.
Agents: Action and Plan‑and‑Execute agents that select tools based on task requirements.
Memory: ConversationBufferMemory to retain dialogue context.
Indexes: Document loaders, text splitters, vector stores, and retrievers for knowledge‑base retrieval.
Chains: Combine multiple components into reusable pipelines.
Tools: Custom and built‑in tools (e.g., web search, math, image generation) that agents can invoke.
Code Examples
from langchain.schema import HumanMessage
from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
llm = OpenAI()
chat_model = ChatOpenAI()
print(llm("say hi!"))
print(chat_model.predict("say hi!"))

A reusable prompt built with PromptTemplate:

from langchain import PromptTemplate
template = """
I want you to act as a naming consultant for new companies.
What is a good name for a company that makes {product}?
"""
prompt = PromptTemplate(
input_variables=["product"],
template=template,
)
prompt.format(product="colorful socks")

Chat prompt templates compose system and human messages:

from langchain.prompts import (
ChatPromptTemplate,
PromptTemplate,
SystemMessagePromptTemplate,
AIMessagePromptTemplate,
HumanMessagePromptTemplate,
)
template = "You are a helpful assistant that translates {input_language} to {output_language}."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
print(chat_prompt.format_prompt(input_language="English", output_language="French", text="I love programming.").to_messages())

A custom example selector picks few-shot examples at runtime:

from langchain.prompts.example_selector.base import BaseExampleSelector
from typing import Dict, List
import numpy as np
class CustomExampleSelector(BaseExampleSelector):
    def __init__(self, examples: List[Dict[str, str]]):
        self.examples = examples

    def add_example(self, example: Dict[str, str]) -> None:
        self.examples.append(example)

    def select_examples(self, input_variables: Dict[str, str]) -> List[dict]:
        # Randomly pick two stored examples (wrap in list() so the
        # return type matches the declared List[dict]).
        return list(np.random.choice(self.examples, size=2, replace=False))
examples = [{"foo": "1"}, {"foo": "2"}, {"foo": "3"}]
example_selector = CustomExampleSelector(examples)
print(example_selector.select_examples({"foo": "foo"}))
example_selector.add_example({"foo": "4"})
print(example_selector.examples)
print(example_selector.select_examples({"foo": "foo"}))

Embeddings turn text into vectors for similarity search:

from langchain.embeddings import OpenAIEmbeddings
embeddings = OpenAIEmbeddings()
text = "This is a test document."
query_result = embeddings.embed_query(text)
doc_result = embeddings.embed_documents([text])
print(doc_result)

Memory keeps the running conversation as context:

from langchain import ConversationChain, OpenAI
from langchain.memory import ConversationBufferMemory
memory = ConversationBufferMemory()
memory.chat_memory.add_user_message("Hello!")
memory.chat_memory.add_ai_message("How are you?")
llm = OpenAI(temperature=0)
chain = ConversationChain(llm=llm, verbose=True, memory=memory)
chain.predict(input="How have you been lately?")
print(chain.predict(input="Pretty good, I just had a conversation with an AI."))

Chains tie a model and a prompt into one callable unit:

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain import PromptTemplate
from langchain.prompts.chat import (
ChatPromptTemplate,
HumanMessagePromptTemplate,
)
human_message_prompt = HumanMessagePromptTemplate(
prompt=PromptTemplate(
template="What is a good name for a company that makes {product}?",
input_variables=["product"],
)
)
chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])
chat = ChatOpenAI(temperature=0.9)
chain = LLMChain(llm=chat, prompt=chat_prompt_template)
print(chain.run("socks"))

Agents choose among tools (here web search and a calculator) to answer multi-step questions:

from langchain.agents import load_tools, initialize_agent, AgentType
from langchain.llms import OpenAI
llm = OpenAI(temperature=0)
tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)
agent.run("How old is Trump this year? What is his age divided by 2?")

Indexes support retrieval-augmented Q&A over your own documents:

import os
from langchain.chains import RetrievalQA
from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.indexes import VectorstoreIndexCreator
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import Chroma
from langchain.llms import OpenAI
os.environ['HTTP_PROXY'] = 'socks5h://127.0.0.1:13659'
os.environ['HTTPS_PROXY'] = 'socks5h://127.0.0.1:13659'
loader = TextLoader('/Users/aihe/Downloads/demo.txt', encoding='utf8')
documents = loader.load()
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
texts = text_splitter.split_documents(documents)
embeddings = OpenAIEmbeddings()
db = Chroma.from_documents(texts, embeddings)
retriever = db.as_retriever()
qa = RetrievalQA.from_chain_type(llm=OpenAI(), chain_type="stuff", retriever=retriever)
print(qa.run("How do I apply for a tenant?"))
print(qa.run("Can you describe the features you provide?"))

Applications
The article showcases several practical use cases: building a retrieval-backed Q&A system, creating image-generation pipelines, constructing chatbots with LangFlow visual orchestration, and designing custom tools for specific business logic.
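As a sketch of what a custom business-logic tool amounts to, the snippet below models the contract an agent relies on: a name, a natural-language description the LLM reads when deciding whether to invoke the tool, and a callable that does the work. It deliberately avoids importing LangChain so the pattern stands on its own; the order-status function and its data are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

# Minimal stand-in for the contract a LangChain Tool exposes to an agent.
@dataclass
class SimpleTool:
    name: str
    description: str
    func: Callable[[str], str]

    def run(self, tool_input: str) -> str:
        return self.func(tool_input)

# Hypothetical business logic: look up an order's shipping status.
_ORDERS = {"A100": "shipped", "A101": "processing"}

def order_status(order_id: str) -> str:
    return _ORDERS.get(order_id.strip(), "unknown order")

order_tool = SimpleTool(
    name="order-status",
    description="Look up the shipping status of an order by its ID.",
    func=order_status,
)

print(order_tool.run("A100"))  # shipped
```

In LangChain itself the equivalent is a Tool(name=..., description=..., func=...) passed to initialize_agent; the agent reads the description to decide when to call the tool, exactly the role the description field plays above.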
Future Outlook
LangChain is expected to expand into domains such as intelligent customer service, personalized recommendation, knowledge‑graph construction, automated summarization, code review assistants, SEO optimization, data analysis, smart programming assistants, online education, and automated testing.
References include the LangChain Chinese guide, official documentation, and the LangFlow visual tool repository.
DataFunTalk
Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.