How to Build a LangChain4j MCP Tool Provider with Docker and GitHub Integration
This tutorial explains how to use LangChain4j's Model Context Protocol (MCP) to create a tool provider, configure HTTP or stdio transports, run a GitHub MCP server in Docker, and employ a language model to summarize recent repository commits, complete with code samples and logging customization.
Introduction
LangChain4j implements the Model Context Protocol (MCP) which enables a client to communicate with MCP‑compliant servers and invoke server‑side tools. Two transport modes are supported:
HTTP mode – the client receives server‑sent events (SSE) on a dedicated channel and sends commands via HTTP POST.
Stdio mode – the client starts the MCP server as a local subprocess and exchanges messages through standard input/output.
To let a chat model or other AI service use tools provided by an MCP server, you first create an MCP tool provider instance.
Create MCP Tool Provider
Transport configuration
Stdio transport example (using an NPM package):
McpTransport transport = new StdioMcpTransport.Builder()
.command(List.of("/usr/bin/npm", "exec", "@modelcontextprotocol/server-everything@0.6.2"))
.logEvents(true) // optional logging
.build();

HTTP transport example (the SSE URL is required; the server announces the endpoint for HTTP POST commands over the SSE channel):
McpTransport transport = new HttpMcpTransport.Builder()
.sseUrl("http://localhost:3001/sse") // SSE channel address
.logRequests(true)
.logResponses(true)
.build();

Create MCP client
The client is built from the chosen transport and handles the low‑level MCP communication.
McpClient mcpClient = new DefaultMcpClient.Builder()
.transport(transport)
.build();

Build the tool provider

A tool provider aggregates one or more MCP clients. Failure handling can be tuned with builder.failIfOneServerFails(boolean); the default is false, meaning the failure of a single server is ignored and the tools of the remaining servers are still used.
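For instance, a stricter provider that aborts if any of its servers fails could be configured as follows. This is a sketch against the LangChain4j MCP API, assuming an already built `mcpClient` and the `langchain4j-mcp` module on the classpath:

```java
// Sketch: abort when any configured MCP server fails,
// instead of silently skipping it (the default, failIfOneServerFails = false).
ToolProvider strictToolProvider = McpToolProvider.builder()
        .mcpClients(List.of(mcpClient))   // mcpClient built as shown below
        .failIfOneServerFails(true)       // propagate a single server failure
        .build();
```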
ToolProvider toolProvider = McpToolProvider.builder()
.mcpClients(List.of(mcpClient))
.build();

Bind the provider to an AI service, for example a LangChain4j bot:
Bot bot = AiServices.builder(Bot.class)
.chatModel(model)
.toolProvider(toolProvider)
.build();

Logging
MCP servers can push log messages to the client. By default they are routed to SLF4J. To use a custom handler, implement dev.langchain4j.mcp.client.logging.McpLogMessageHandler and supply it to the client builder:
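A minimal handler might look like the following sketch. It assumes that McpLogMessageHandler exposes a single handleLogMessage callback and that McpLogMessage carries a level and a data payload (accessor names are assumptions; check them against the langchain4j-mcp version you use):

```java
import dev.langchain4j.mcp.client.logging.McpLogMessage;
import dev.langchain4j.mcp.client.logging.McpLogMessageHandler;

// Hypothetical handler that prints MCP server log messages to stdout
// instead of routing them through SLF4J.
public class MyLogMessageHandler implements McpLogMessageHandler {

    @Override
    public void handleLogMessage(McpLogMessage message) {
        // level() and data() are assumed accessors on McpLogMessage
        System.out.println("[MCP " + message.level() + "] " + message.data());
    }
}
```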
McpClient mcpClient = new DefaultMcpClient.Builder()
.transport(transport)
.logMessageHandler(new MyLogMessageHandler())
.build();

Resource Operations
client.listResources() – returns a list of McpResource objects containing metadata and a URI.
client.listResourceTemplates() – fetches available resource templates.
client.readResource(uri) – reads a specific resource. The result is an McpReadResourceResult that holds either McpBlobResourceContents (binary) or McpTextResourceContents (text).
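Putting these operations together, a client-side walk over the available resources could be sketched like this (text-only handling shown; the uri(), contents() and text() accessors are assumptions based on the type names above):

```java
// Sketch: enumerate resources and print the text contents of each one.
for (McpResource resource : mcpClient.listResources()) {
    McpReadResourceResult result = mcpClient.readResource(resource.uri());
    for (McpResourceContents contents : result.contents()) {
        if (contents instanceof McpTextResourceContents text) {
            System.out.println(resource.uri() + ": " + text.text());
        }
        // McpBlobResourceContents (binary) would need separate handling
    }
}
```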
Prompt Operations
client.listPrompts() – returns a list of McpPrompt definitions.
client.getPrompt(name, arguments) – renders a prompt and returns an McpPromptMessage (role and content).
Supported content types are McpTextContent, McpImageContent and McpEmbeddedResource. An McpPromptMessage can be converted to a LangChain4j ChatMessage via McpPromptMessage.toChatMessage(), but only assistant‑role messages with plain text are allowed; binary content cannot be converted.
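As a sketch, rendering a prompt and converting it for the chat model could look like the following. The prompt name "summarize" and its arguments are hypothetical, and the call shapes follow the method descriptions above rather than a verified API signature:

```java
// Sketch: render a named prompt on the server, then convert it
// to a LangChain4j ChatMessage for use with a chat model.
McpPromptMessage promptMessage =
        mcpClient.getPrompt("summarize", Map.of("repo", "langchain4j"));

// Conversion only succeeds for plain-text messages with a convertible role;
// binary content throws instead of converting.
ChatMessage chatMessage = promptMessage.toChatMessage();
```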
Run GitHub MCP Server with Docker
Clone the MCP GitHub server repository and build a Docker image:
docker build -t mcp/github -f Dockerfile .
After the build, the image mcp/github is available locally:
docker image ls
REPOSITORY   TAG      IMAGE ID       SIZE
mcp/github   latest   b141704170b1   173MB

Tool Provider Code Example
The following Java class demonstrates a complete workflow:
Start the Docker container that runs the GitHub MCP server.
Connect to the server using stdio transport.
Use an Ollama language model to ask the server to summarize the last three commits of the LangChain4j GitHub repository.
Run the container (replace token with a personal access token if needed):
docker run --rm -d \
--name mcp-github-server \
-e GITHUB_PERSONAL_ACCESS_TOKEN=token \
mcp/github

Java implementation:
public static void main(String[] args) throws Exception {
    // 1. Language model (Ollama) configuration
    ChatLanguageModel model = OllamaChatModel.builder()
            .baseUrl("http://localhost:11434")
            .modelName("llama3-groq-tool-use:8b")
            .logRequests(true)
            .logResponses(true)
            .build();

    // 2. Stdio transport that launches the Docker container
    McpTransport transport = new StdioMcpTransport.Builder()
            .command(List.of("/usr/local/bin/docker", "run", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "-i", "mcp/github"))
            .logEvents(true)
            .build();

    // 3. MCP client
    McpClient mcpClient = new DefaultMcpClient.Builder()
            .transport(transport)
            .build();

    // 4. Tool provider wrapping the client
    ToolProvider toolProvider = McpToolProvider.builder()
            .mcpClients(List.of(mcpClient))
            .build();

    // 5. Bot that can call tools via the provider
    Bot bot = AiServices.builder(Bot.class)
            .chatModel(model)
            .toolProvider(toolProvider)
            .build();

    try {
        String response = bot.chat("Summarize the last 3 commits of the LangChain4j GitHub repository");
        System.out.println("RESPONSE: " + response);
    } finally {
        mcpClient.close();
    }
}

Execute Example
Running the Java program prints a summary of the three most recent commits, including commit SHA, author, timestamp and a short description. Example output:
Here is a summary of the three most recent commits to the LangChain4j GitHub repository:
1. **Commit 36951f9** (date: 2025-02-05)
- Author: Dmytro Liubarskyi
- Message: update to `upload-pages-artifact@v3`
- Details: upgrades the GitHub Action that uploads page artifacts to version 3.
2. **Commit 6fcd19f** (date: 2025-02-05)
- Author: Dmytro Liubarskyi
- Message: update to `checkout@v4`, `deploy-pages@v4` and `upload-pages-artifact@v4`
- Details: upgrades several GitHub Actions to version 4.
3. **Commit 2e74049** (date: 2025-02-05)
- Author: Dmytro Liubarskyi
- Message: update to `setup-node@v4` and `configure-pages@v4`
- Details: upgrades the related GitHub Actions to newer versions.

JavaEdge
Hands-on development experience at multiple leading tech firms; now a software architect at a Shanghai state-owned enterprise and founder of Programming Yanxuan, with nearly 300k followers online. Expertise in distributed system design, AIGC application development, and quantitative finance investing.