How to Build an MCP Client‑Server with Spring AI for LLM‑Powered Apps

This article demonstrates how to implement the Model Context Protocol (MCP) using Spring AI: creating MCP hosts, clients, and servers; configuring dependencies; integrating Claude; adding Brave Search and filesystem tools; and building a functional chatbot that leverages external data sources through standardized LLM interfaces.

Programmer DD

Modern web applications increasingly integrate large language models (LLMs) to build intelligent solutions that go beyond simple Q&A. To overcome an LLM's knowledge limits and improve its understanding of context, developers connect models to multiple data sources such as search engines, databases, and file systems. Each source speaks its own protocol, however, which raises integration complexity.

Anthropic introduced the Model Context Protocol (MCP) to standardize interactions between AI applications and external data sources, providing a unified, extensible framework.

What Is MCP?

MCP follows a client‑server architecture with three key components:

MCP Host: The AI‑enabled application (e.g., the Claude desktop client or Cursor) that interacts with the LLM and requests tools.

MCP Client: A component inside the AI application that maintains a one‑to‑one connection to an MCP Server and formats requests for external resources such as PostgreSQL.

MCP Server: Middleware that exposes external data sources (databases, APIs, file systems) to the LLM.
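Under the hood, every exchange between an MCP client and server is a JSON‑RPC 2.0 message. As an illustrative sketch (this is not Spring AI's internal code, and the tool name and argument are hypothetical), here is the shape of the `tools/call` request a client sends to a server:

```java
// Sketch of the JSON-RPC 2.0 envelope an MCP client sends to invoke a tool.
// The "brave_web_search" tool name and "query" argument below are illustrative.
public class McpToolCallExample {

    // Builds a tools/call request: the MCP method name and params structure
    // (name + arguments) come from the MCP specification.
    static String toolCallRequest(int id, String toolName, String argumentsJson) {
        return "{\"jsonrpc\":\"2.0\",\"id\":" + id
            + ",\"method\":\"tools/call\",\"params\":{\"name\":\"" + toolName
            + "\",\"arguments\":" + argumentsJson + "}}";
    }

    public static void main(String[] args) {
        String request = toolCallRequest(1, "brave_web_search",
            "{\"query\":\"Spring AI MCP\"}");
        System.out.println(request);
    }
}
```

Spring AI's MCP starters build and parse these envelopes for you; the point is only that the wire format is uniform regardless of which data source sits behind the server.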

Create an MCP Host

We build a Spring Boot application that uses Anthropic's Claude model as its LLM; this application plays the role of the MCP Host.

Add Required Dependencies

<dependency>
  <groupId>org.springframework.ai</groupId>
  <artifactId>spring-ai-anthropic-spring-boot-starter</artifactId>
  <version>1.0.0-M6</version>
</dependency>
<dependency>
  <groupId>org.springframework.ai</groupId>
  <artifactId>spring-ai-mcp-client-spring-boot-starter</artifactId>
  <version>1.0.0-M6</version>
</dependency>

<repositories>
  <repository>
    <id>spring-milestones</id>
    <name>Spring Milestones</name>
    <url>https://repo.spring.io/milestone</url>
    <snapshots><enabled>false</enabled></snapshots>
  </repository>
</repositories>

Key points:

Use spring-ai-anthropic-spring-boot-starter for Claude; other models can use alternative starters.

Include spring-ai-mcp-client-spring-boot-starter to enable the MCP client.

Add the Spring Milestones repository because 1.0.0-M6 is a milestone release, not yet published to Maven Central.

Configure Model Credentials

spring:
  ai:
    anthropic:
      api-key: ${ANTHROPIC_API_KEY}
      chat:
        options:
          model: claude-3-7-sonnet-20250219

The placeholder loads the API key from an environment variable, and the model can be swapped as needed.

Configure MCP Clients for Brave Search and Filesystem

Brave Search client (stdio transport):

spring:
  ai:
    mcp:
      client:
        stdio:
          connections:
            brave-search:
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-brave-search"
              env:
                BRAVE_API_KEY: ${BRAVE_API_KEY}

Filesystem client (stdio transport):

spring:
  ai:
    mcp:
      client:
        stdio:
          connections:
            filesystem:
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-filesystem"
                - "./"

These configurations let the chatbot perform web searches and file‑system operations.

Build a Simple Chatbot

Create a ChatClient bean that registers the default tools provided by the MCP client:

@Bean
ChatClient chatClient(ChatModel chatModel, SyncMcpToolCallbackProvider toolCallbackProvider) {
    return ChatClient.builder(chatModel)
        .defaultTools(toolCallbackProvider.getToolCallbacks())
        .build();
}

A ChatbotService that forwards user questions to the LLM:

@Service
class ChatbotService {
    private final ChatClient chatClient;

    ChatbotService(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    String chat(String question) {
        return chatClient.prompt().user(question).call().content();
    }
}

Expose a REST endpoint:

@PostMapping("/chat")
ResponseEntity<ChatResponse> chat(@RequestBody ChatRequest chatRequest) {
    String answer = chatbotService.chat(chatRequest.question());
    return ResponseEntity.ok(new ChatResponse(answer));
}

record ChatRequest(String question) {}
record ChatResponse(String answer) {}

Build an MCP Server

Add Server Dependency

<dependency>
  <groupId>org.springframework.ai</groupId>
  <artifactId>spring-ai-mcp-server-webmvc-spring-boot-starter</artifactId>
  <version>1.0.0-M6</version>
</dependency>

Define Custom Tools

Example AuthorRepository exposing two tools via @Tool annotations:

class AuthorRepository {
    @Tool(description = "Get Baeldung author details using an article title")
    Author getAuthorByArticleTitle(String articleTitle) {
        return new Author("John Doe", "[email protected]");
    }

    @Tool(description = "Get highest rated Baeldung authors")
    List<Author> getTopAuthors() {
        return List.of(
            new Author("John Doe", "[email protected]"),
            new Author("Jane Doe", "[email protected]")
        );
    }

    record Author(String name, String email) {}
}
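Before wiring these methods into MCP, you can sanity-check the tool logic as plain Java. This is a standalone copy of the repository above with the Spring AI @Tool annotations stripped so it runs without any framework on the classpath:

```java
import java.util.List;

// Plain-Java check of the tool logic from AuthorRepository, with the
// @Tool annotations removed so the class compiles and runs standalone.
public class AuthorRepositoryCheck {

    record Author(String name, String email) {}

    // Mirrors the article's hard-coded stub implementation.
    static Author getAuthorByArticleTitle(String articleTitle) {
        return new Author("John Doe", "[email protected]");
    }

    static List<Author> getTopAuthors() {
        return List.of(
            new Author("John Doe", "[email protected]"),
            new Author("Jane Doe", "[email protected]")
        );
    }

    public static void main(String[] args) {
        System.out.println(getAuthorByArticleTitle("Testing CORS in Spring Boot"));
        System.out.println(getTopAuthors().size());
    }
}
```

Once the logic behaves as expected, the annotated version is what gets registered with the MCP server.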

Register the tools:

@Bean
ToolCallbackProvider authorTools() {
    return MethodToolCallbackProvider.builder()
        .toolObjects(new AuthorRepository())
        .build();
}
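For the client to reach this server, the server application must listen on the port the client configuration references (8081 in this article). A minimal sketch of the server's application.yml, assuming the 1.0.0-M6 property names (adjust the keys if your Spring AI version differs):

```yaml
# Sketch: MCP server application.yml. The port matches the client
# configuration in this article; name and version identify the server
# to connecting clients.
server:
  port: 8081

spring:
  ai:
    mcp:
      server:
        name: author-tools-server
        version: 1.0.0
```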

Configure MCP Client for the Custom Server

spring:
  ai:
    mcp:
      client:
        sse:
          connections:
            author-tools-server:
              url: http://localhost:8081

This client uses Server‑Sent Events (SSE) to communicate with the custom MCP server.

Test the Chatbot

Query the chatbot via HTTPie:

http POST :8080/chat question="How much was Elon Musk's initial offer to buy OpenAI in 2025?"

Sample response:

{
  "answer": "Elon Musk's initial offer to buy OpenAI was $97.4 billion. [Source](https://www.reuters.com/technology/openai-board-rejects-musks-974-billion-offer-2025-02-14/)."
}

File‑system operation example:

http POST :8080/chat question="Create a text file named 'mcp-demo.txt' with content 'This is awesome!'."
{
  "answer": "The text file named 'mcp-demo.txt' has been successfully created with the content you specified."
}

Custom tool usage example:

http POST :8080/chat question="Who wrote the article 'Testing CORS in Spring Boot' on Baeldung, and how can I contact them?"
{
  "answer": "The article 'Testing CORS in Spring Boot' on Baeldung was written by John Doe. You can contact him via email at [email protected]."
}

Conclusion

This tutorial explored the Model Context Protocol and demonstrated how to implement its client‑server architecture with Spring AI. While the client side is straightforward, the real challenge lies in the server’s ability to expose useful capabilities, often requiring external APIs that MCP can invoke.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: LLM, Model Context Protocol, spring-boot, ai-integration
Written by Programmer DD, a tinkering programmer and author of "Spring Cloud Microservices in Action".
