Building a Gitee AI Repository Assistant with MCP and LangChain4j

This article explains the Model Context Protocol (MCP) introduced by Gitee, shows how Java developers can integrate it using LangChain4j, compares stdio and SSE transport modes, provides full code samples, installation steps, and demonstrates a practical AI‑powered repository assistant.

Architect

Background

With the rapid development of artificial‑intelligence technology, development tools are evolving. Gitee, a leading Chinese code‑hosting platform, launched the Model Context Protocol (MCP) server, enabling AI assistants to directly access repositories for Issue management, Pull‑Request review, and other code operations.

What is MCP?

MCP is a standard protocol that allows AI models to interact with external tools and services. Through MCP, an AI model can read repository contents, view commit history, create repositories, commit code, and manage Issues and Pull Requests, achieving true automation rather than merely providing suggestions.
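On the wire, MCP is built on JSON-RPC 2.0: the client invokes server-side tools through methods such as `tools/call`. The sketch below shows what such a request might look like for an issue-listing tool; the tool name and argument schema here are illustrative, not the actual mcp-gitee contract.

```java
public class McpMessageDemo {
    // Builds a sample JSON-RPC 2.0 "tools/call" request.
    // Tool name and arguments are hypothetical, for illustration only.
    static String buildRequest() {
        return """
                {
                  "jsonrpc": "2.0",
                  "id": 1,
                  "method": "tools/call",
                  "params": {
                    "name": "list_repo_issues",
                    "arguments": { "owner": "log4j", "repo": "pig", "state": "open" }
                  }
                }
                """;
    }

    public static void main(String[] args) {
        System.out.println(buildRequest());
    }
}
```

The transport layer (stdio or SSE, discussed below) only decides how such messages travel between client and server; the message shape stays the same.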

Java Ecosystem Implementation

LangChain4j Overview

LangChain4j is a Java library that simplifies integration with large language models (LLMs) and supports connecting to MCP servers.

MCP Java Client Construction

Add the following Maven dependencies to a Java project:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-mcp</artifactId>
    <version>1.0.0-beta2</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
    <version>1.0.0-beta2</version>
</dependency>

Configure the AI Model

In application.yml, configure the model through the OpenAI-compatible settings (DeepSeek in this example):

langchain4j:
  open-ai:
    chat-model:
      api-key: sk-******
      base-url: https://api.deepseek.com/v1
      model-name: deepseek-chat
      log-requests: true
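The test snippets below call a `GiteeAiService` that the article never defines. With LangChain4j's `AiServices`, this can be a plain interface whose implementation is generated at runtime; a minimal sketch might look like:

```java
// Hypothetical definition of the GiteeAiService used in the samples below;
// AiServices.builder(GiteeAiService.class)...build() generates the
// implementation and wires the MCP-provided tools into each chat turn.
interface GiteeAiService {
    // One chat turn: the model may call Gitee MCP tools before answering.
    String chat(String userMessage);
}
```

Because it is an ordinary interface, it can also be stubbed directly in unit tests without any model behind it.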

Transport Modes

(1) stdio Transport

Uses local standard input/output streams, suitable for local development and testing.
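Conceptually, the stdio transport launches the MCP server as a child process and exchanges newline-delimited JSON-RPC messages over its stdin/stdout. The toy sketch below illustrates that line-based framing with in-memory pipes standing in for the real process streams; it is not the actual transport implementation.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class StdioFramingDemo {
    // Writes one newline-terminated message and reads one frame back,
    // mimicking how the stdio transport talks to the child process.
    static String roundTrip(String message) throws IOException {
        PipedOutputStream clientOut = new PipedOutputStream();
        PipedInputStream serverIn = new PipedInputStream(clientOut);

        // "Client" side: send a framed message.
        Writer writer = new OutputStreamWriter(clientOut, StandardCharsets.UTF_8);
        writer.write(message + "\n");   // the newline terminates the frame
        writer.flush();

        // "Server" side: read exactly one frame.
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(serverIn, StandardCharsets.UTF_8));
        return reader.readLine();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"ping\"}"));
    }
}
```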

Implementation example (the @SneakyThrows and @Cleanup annotations come from Lombok, and log from @Slf4j):

@Autowired
private ChatLanguageModel chatLanguageModel;

@SneakyThrows
@Test
void contextLoads() {
    McpTransport transport = new StdioMcpTransport.Builder()
            .command(List.of("/path/to/mcp-gitee", "-token", "GITEE-TOKEN"))
            .logEvents(true)
            .build();

    @Cleanup McpClient mcpClient = new DefaultMcpClient.Builder()
            .transport(transport)
            .build();

    ToolProvider toolProvider = McpToolProvider.builder()
            .mcpClients(List.of(mcpClient))
            .build();

    GiteeAiService giteeAiService = AiServices.builder(GiteeAiService.class)
            .chatLanguageModel(chatLanguageModel)
            .toolProvider(toolProvider)
            .build();

    String result = giteeAiService.chat("Get the list of open issues for log4j/pig");
    log.info("gitee mcp result: {}", result);
}

(2) SSE Transport

Uses an HTTP connection where the server pushes events, suitable for distributed or multi‑client scenarios.

Server start command:

mcp-gitee -transport sse -token GITEE-TOKEN

Client implementation example:

@Autowired
private ChatLanguageModel chatLanguageModel;

@SneakyThrows
@Test
void contextLoads() {
    McpTransport sseTransport = new HttpMcpTransport.Builder()
            .sseUrl("http://localhost:8000/sse")
            .logRequests(true)
            .logResponses(true)
            .build();

    @Cleanup McpClient mcpClient = new DefaultMcpClient.Builder()
            .transport(sseTransport)
            .build();

    ToolProvider toolProvider = McpToolProvider.builder()
            .mcpClients(List.of(mcpClient))
            .build();

    GiteeAiService giteeAiService = AiServices.builder(GiteeAiService.class)
            .chatLanguageModel(chatLanguageModel)
            .toolProvider(toolProvider)
            .build();

    String result = giteeAiService.chat("Get the list of open issues for log4j/pig");
    log.info("gitee mcp result: {}", result);
}
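For reference, the SSE endpoint streams plain-text events in which each `data:` field carries one payload. A minimal parser for that wire format, independent of any HTTP client, could look like:

```java
import java.util.ArrayList;
import java.util.List;

public class SseParseDemo {
    // Extracts the payload of every "data:" field from a raw SSE stream.
    static List<String> dataFields(String rawStream) {
        List<String> payloads = new ArrayList<>();
        for (String line : rawStream.split("\n")) {
            if (line.startsWith("data:")) {
                payloads.add(line.substring(5).trim());
            }
        }
        return payloads;
    }

    public static void main(String[] args) {
        String stream = "event: message\ndata: {\"jsonrpc\":\"2.0\"}\n\n";
        System.out.println(dataFields(stream));
    }
}
```

In practice the LangChain4j `HttpMcpTransport` handles this parsing internally; the sketch only shows what travels over the connection.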

Mode Comparison

Deployment: stdio runs as a local subprocess; SSE runs as an independent server process.

Use case: stdio for local development; SSE for distributed deployment and multiple clients.

Configuration complexity: stdio is simple; SSE is more complex.

Multi‑client support: stdio serves a single local client; SSE can serve multiple clients.

Network requirement: stdio works offline; SSE requires network connectivity.

Practical Application: Gitee AI Repository Assistant

The AI assistant, powered by MCP, can:

Read and understand repository Issues.

Automatically review Pull Request code changes.

Monitor repository status.

Perform code‑management actions such as creating branches, committing code, and merging PRs.

Example query result:

The log4j/pig repository currently has the following open issues:

1. In JDK 17, the OAuth 2.0 authorization‑code flow cannot exchange the code for an access_token
   - ID: IBQJ94
   - Created: 2025-03-04T13:04:53+08:00

Installation

Binary download – obtain the executable from the repository’s release page.

Source compilation – run:

git clone https://gitee.com/oschina/mcp-gitee.git
cd mcp-gitee
make build

Go install – for Go 1.23+:

# Install Go 1.23+
# Install mcp-gitee
go install gitee.com/oschina/mcp-gitee@latest

Conclusion and Outlook

Deep integration of Java with MCP enables the creation of a powerful Gitee repository assistant, bringing intelligent and automated code management to developers. MCP greatly expands the possibilities of AI in software development and is poised to become a foundational infrastructure for enterprise‑level AI applications. As the technology matures, collaboration between developers and AI will become tighter, making software development more efficient and intelligent.

Written by Architect
Professional architect sharing high‑quality architecture insights. Topics include high‑availability, high‑performance, high‑stability architectures, big data, machine learning, Java, system and distributed architecture, AI, and practical large‑scale architecture case studies. Open to ideas‑driven architects who enjoy sharing and learning.
