Build a Cultural Name‑Generator with LangChain, Custom Prompts, and Output Parsers
This tutorial walks through installing LangChain, creating an LLM (either on your own GPU resources or through a third-party API), designing parameterized prompt templates, implementing a custom output parser for structured results, and running a complete Python example that generates culturally specific names.
Installation
Install the required LangChain version:
pip install --upgrade langchain==0.0.279 -i https://pypi.org/simple

1. Create an LLM
Use your own compute with an open‑source large model (requires substantial GPU resources) and your own training data.
Or call a third‑party LLM API such as OpenAI, Baidu Wenxin, or Alibaba Tongyi; data preparation is optional.
The examples below use OpenAI's gpt-3.5-turbo-instruct model.
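To make the third-party-API route concrete, here is a rough sketch of the kind of JSON payload a completions-style endpoint receives. The helper name is invented for illustration; LangChain's OpenAI wrapper builds and sends a payload like this for you, so you never write this by hand.

```python
import json

def build_completion_request(prompt: str, model: str = "gpt-3.5-turbo-instruct") -> str:
    """Return the JSON body a completions-style API call would send.

    Illustrative only -- the function name is made up; LangChain
    constructs the real request internally.
    """
    return json.dumps({"model": model, "prompt": prompt, "temperature": 0})

print(build_completion_request("Suggest three Chinese names."))
```

The point is simply that "creating an LLM" in LangChain means configuring a client for calls like this, not training anything yourself.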
2. Custom Prompt Template
Parameterize the prompt so the same template can generate names for different cultures.
Support passing variables (e.g., {county}, {boy}, {girl}).
Sample template:
"You are a naming master, generate three {county} names, e.g., boy name {boy}, girl name {girl}."Usage
Import the template class:
from langchain.prompts import PromptTemplate

3. Output Parser
Convert the LLM’s raw text into a structured format such as a JSON array or a comma‑separated list.
Implement a subclass of BaseOutputParser that splits the result on commas.
from langchain.schema import BaseOutputParser

class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        return text.strip().split(", ")

# Example
print(CommaSeparatedListOutputParser().parse("hi, bye"))  # ['hi', 'bye']

4. Full Working Example
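Before wiring up a real API key, the whole template → LLM → parser flow can be dry-run with plain string formatting and a stubbed model. This is a sketch only: the stub function and its canned names are placeholders standing in for a real completion call.

```python
# Offline sketch of the template -> LLM -> parser flow.
template = ("You are a naming master, generate three {county} names, "
            "e.g., boy name {boy}, girl name {girl}.")

def fake_llm(prompt: str) -> str:
    # Stand-in for a real completion; returns canned placeholder names.
    return "Jack, Michael, Jason"

message = template.format(county="Chinese", boy="DogEgg", girl="CuiHua")
names = fake_llm(message).strip().split(", ")
print(names)  # ['Jack', 'Michael', 'Jason']
```

The live version below is the same pipeline with OpenAI's model in place of the stub.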
Set environment variables for the OpenAI key and optional proxy:
import os
os.environ["OPENAI_KEY"] = "xxxxx"
os.environ["OPENAI_API_BASE"] = "xxxxx" # if a proxy is neededLoad the key in Python:
import os
openai_api_key = os.getenv("OPENAI_KEY")
openai_api_base = os.getenv("OPENAI_API_BASE")
print("OPENAI_API_KEY:", openai_api_key)
print("OPENAI_PROXY:", openai_api_base)Instantiate the LLM and the prompt:
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
llm = OpenAI(
    model="gpt-3.5-turbo-instruct",
    temperature=0,
    openai_api_key=openai_api_key,
    openai_api_base=openai_api_base,
)
prompt = PromptTemplate.from_template(
    "You are a naming master, generate three {county} names. "
    "Example: boy {boy}, girl {girl}. Return a comma-separated list only."
)
message = prompt.format(county="Chinese", boy="DogEgg", girl="CuiHua")
print(message)
raw_output = llm.predict(message)
print(raw_output)
parser = CommaSeparatedListOutputParser()
names = parser.parse(raw_output)
print(names)

The script prints a list such as ['Jack', 'Michael', 'Jason'], demonstrating how to obtain structured name data.
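Real model output is not always a tidy single line; it may include leading whitespace or put each name on its own line. A slightly more defensive variant of the comma parser (an optional hardening, not part of the original tutorial) tolerates newlines and stray whitespace:

```python
import re

class RobustCommaListParser:
    """Split LLM output on commas or newlines and drop empty pieces.

    Defensive variant of the tutorial's parser, shown standalone
    here without the LangChain BaseOutputParser base class.
    """

    def parse(self, text: str):
        parts = re.split(r"[,\n]", text)
        return [p.strip() for p in parts if p.strip()]

print(RobustCommaListParser().parse(" Jack, Michael,\nJason "))  # ['Jack', 'Michael', 'Jason']
```

Swapping this in for CommaSeparatedListOutputParser makes the pipeline less sensitive to formatting quirks in the model's reply.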
5. Verify Installation
!pip show langchain
!pip show openai
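Alternatively, the same check can be done from inside Python with the standard library's importlib.metadata (a small convenience sketch; the helper function here is our own, not part of LangChain):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_versions(*packages):
    """Map each package name to its installed version, or None if absent."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None
    return result

print(installed_versions("langchain", "openai"))
```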
JavaEdge
First‑line development experience at multiple leading tech firms; now a software architect at a Shanghai state‑owned enterprise and founder of Programming Yanxuan. Nearly 300k followers online; expertise in distributed system design, AIGC application development, and quantitative finance investing.