Integrating LangChain4j with Spring Boot for Fast AI Conversations on Alibaba Bailian

This guide walks through integrating Alibaba Bailian models into a Spring Boot application via LangChain4j, explains core concepts, compares LangChain4j with Spring AI and the direct OpenAI API, and provides step‑by‑step dependency setup, environment configuration, code examples, and a simple browser test.


Scenario

Integrate LangChain4j into a Spring Boot project to call Alibaba Bailian models through the DashScope API, then add OpenAI‑compatible streaming chat.

What is LangChain4j?

LangChain4j is an open‑source Java framework inspired by Python’s LangChain. It provides a standardized API and a library of components (model adapters, prompts, memory, retrieval‑augmented generation, tools, chains, agents) so developers can assemble AI applications without handling low‑level model calls. It supports more than 15 LLM providers (e.g., OpenAI, Tongyi Qianwen, Zhipu) and over 15 vector stores (e.g., Qdrant, Milvus).

Core Concepts and Components

Model : Low‑level API for direct interaction with an LLM.

AI Service : High‑level API; define a Java interface with annotations and the framework generates the implementation.

Prompt : Annotations such as @SystemMessage and @UserMessage specify the system role and user request.

Memory : Annotation @MemoryId creates separate conversation histories per user, enabling multi‑turn dialogue.

RAG (Retrieval‑Augmented Generation) : A RetrievalAugmentor, configured on the AI service builder, attaches a private knowledge‑base retriever to an AI service.

Tool (Function Calling) : Annotation @Tool registers a Java method that the LLM can invoke (e.g., weather lookup, database query).

Chains & Agents : Build sequential, parallel, or looped workflows; agents can plan and execute multi‑step tasks autonomously.
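Several of these concepts combine in a single annotated interface. A minimal sketch; the `Assistant` and `WeatherTools` names, the prompt text, and the stub tool body are illustrative, while the annotations are LangChain4j's own:

```java
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.service.MemoryId;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;

// Hypothetical AI Service: LangChain4j generates the implementation at runtime.
public interface Assistant {

    // @SystemMessage sets the role; @MemoryId keeps a separate history per user.
    @SystemMessage("You are a concise technical assistant.")
    String chat(@MemoryId String userId, @UserMessage String message);
}

// Tools live on a plain class whose methods carry @Tool;
// the LLM can decide to invoke them during a conversation.
class WeatherTools {

    @Tool("Returns the current temperature in Celsius for a city")
    double currentTemperature(String city) {
        return 21.5; // stub; a real tool would call a weather API
    }
}
```

An implementation is typically obtained via `AiServices.builder(Assistant.class)`, wiring in the chat model, a chat memory provider, and tool objects before calling `build()`.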

Architecture and Module Design

LangChain4j is modular and can be introduced incrementally:

langchain4j : Core framework defining top‑level APIs and abstractions.

langchain4j-{provider} : Provider‑specific integration (e.g., langchain4j-open-ai, langchain4j-community-dashscope).

langchain4j-{provider}-spring-boot-starter : Spring Boot auto‑configuration for the chosen provider.

A typical application follows these layers:

Access Layer : AI Service receives dialogue requests and orchestrates processing.

Tool Layer : Java methods exposed as callable tools.

Memory Layer : ChatMemory manages per‑session context.

Knowledge Layer : ContentRetriever / EmbeddingStore implement RAG.

Configuration Layer : Spring Boot starter assembles required beans.

Comparison: LangChain4j vs. Spring AI

LangChain4j

Design: AI‑first, flexible, feature‑rich.

Maturity: Comprehensive, especially for advanced RAG and agent scenarios.

Learning curve: Moderate; requires familiarity with AI concepts.

Ecosystem integration: Broad support for mainstream LLMs and vector stores.

Best fit: Complex, highly customized AI applications.

Spring AI

Design: Spring ecosystem‑first, seamless Spring Boot integration.

Maturity: Emerging project with a basic feature set, evolving quickly.

Learning curve: Low for Spring developers.

Ecosystem integration: Deeply tied to Spring libraries.

Best fit: Quickly adding AI capabilities to existing Spring projects.

Further Comparison with Direct OpenAI API

LangChain4j offers a full‑stack framework with ready‑made components (document loaders, tools, chains). Spring AI provides an abstraction layer (ChatClient) that hides provider differences. Using the OpenAI HTTP API directly requires manual request construction, JSON handling, retry logic, and conversation‑state management.
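That manual work is easy to underestimate. A bare‑bones sketch of one direct call to an OpenAI‑compatible chat endpoint using only the JDK HTTP client; the DashScope compatible‑mode URL is shown as an example, and error handling, retries, streaming, and conversation state are all still missing:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Calling an OpenAI-compatible endpoint by hand: every concern the
// frameworks absorb (request JSON, auth header, response parsing)
// is the caller's responsibility here.
public class RawChatCall {
    public static void main(String[] args) throws Exception {
        String apiKey = System.getenv("DASHSCOPE_API_KEY");
        String body = """
            {"model": "qwen-max",
             "messages": [{"role": "user", "content": "Hello"}]}""";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // Raw JSON comes back; extracting the assistant text still
        // requires a JSON parser and knowledge of the response schema.
        System.out.println(response.body());
    }
}
```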

Large projects can combine both: use Spring AI’s ChatClient as a unified model access layer, then augment with LangChain4j’s advanced RAG or agent components.

Implementation Steps

Step 1 – Add Project Dependencies

<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <version>3.2.5</version>
</parent>
<groupId>com.example</groupId>
<artifactId>spring-langchain4j-bailian</artifactId>
<version>1.0</version>
<properties>
  <java.version>17</java.version>
  <langchain4j.version>1.12.2-beta22</langchain4j.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
  </dependency>
  <!-- LangChain4j core Spring Boot starter -->
  <dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-spring-boot-starter</artifactId>
    <version>${langchain4j.version}</version>
  </dependency>
  <!-- DashScope provider (Alibaba Bailian) -->
  <dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-dashscope</artifactId>
    <version>${langchain4j.version}</version>
  </dependency>
</dependencies>

Step 2 – Configure Provider

Add the following excerpt to src/main/resources/application.yml:

langchain4j:
  dashscope:
    api-key: ${DASHSCOPE_API_KEY}
    model-name: qwen-max   # model selected from the Alibaba Bailian model marketplace
    log-requests: true
    log-responses: true

The environment variable DASHSCOPE_API_KEY holds the API key obtained from the Alibaba Bailian (DashScope) console.

Step 3 – Set Environment Variable on Windows

Open “System Properties” → “Advanced” → “Environment Variables”.

Click “New”, set name DASHSCOPE_API_KEY and paste the API key.

Confirm, then restart any open terminals or IDEs so the variable is loaded.
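Before starting the application, a throwaway check confirms the variable is visible to the JVM; this sketch assumes nothing beyond the JDK:

```java
// Quick sanity check that an environment variable reaches the JVM.
public class EnvCheck {

    // Returns true when the variable is present and non-empty.
    static boolean isSet(String name) {
        String value = System.getenv(name);
        return value != null && !value.isBlank();
    }

    public static void main(String[] args) {
        System.out.println(isSet("DASHSCOPE_API_KEY")
                ? "DASHSCOPE_API_KEY is set"
                : "DASHSCOPE_API_KEY is not set");
    }
}
```

If it prints "not set", the terminal or IDE was likely opened before the variable was created and needs a restart.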

Step 4 – Write Core Code

LangChain4j automatically creates a ChatModel bean from the configuration. Inject it into a Spring REST controller:

import dev.langchain4j.model.chat.ChatModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ChatController {
    private final ChatModel chatModel;
    public ChatController(ChatModel chatModel) {
        this.chatModel = chatModel;
    }
    @GetMapping("/ai/chat")
    public String chat(@RequestParam(value = "message", defaultValue = "你好") String message) {
        String response = chatModel.chat(message);
        return response;
    }
}
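The scenario also calls for streaming chat. A sketch of a second controller that pushes partial responses to the browser over Server‑Sent Events; it assumes the starter also exposes a StreamingChatModel bean (which may require additional streaming‑model configuration, and in older LangChain4j versions the type is named StreamingChatLanguageModel with a different handler interface):

```java
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

import java.io.IOException;

@RestController
public class StreamingChatController {

    private final StreamingChatModel streamingChatModel;

    public StreamingChatController(StreamingChatModel streamingChatModel) {
        this.streamingChatModel = streamingChatModel;
    }

    // Each partial token is forwarded to the client as a server-sent event.
    @GetMapping("/ai/stream")
    public SseEmitter stream(@RequestParam(value = "message", defaultValue = "你好") String message) {
        SseEmitter emitter = new SseEmitter();
        streamingChatModel.chat(message, new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(String partialResponse) {
                try {
                    emitter.send(partialResponse);
                } catch (IOException e) {
                    emitter.completeWithError(e);
                }
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                emitter.complete();
            }

            @Override
            public void onError(Throwable error) {
                emitter.completeWithError(error);
            }
        });
        return emitter;
    }
}
```

Opening /ai/stream in a browser (or with an EventSource client) shows the reply arriving token by token instead of as one blocking response.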

Step 5 – Test with a Browser

Run the Spring Boot application and open the following URL (replace the port if necessary):

http://localhost:8080/ai/chat?message=你好,请介绍一下自己

The response from the Bailian model is displayed in the browser.

Chat response screenshot
Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: Java, RAG, Agent, spring-boot, LangChain4j, AI chat, Alibaba Bailian
Written by

The Dominant Programmer

Resources and tutorials for programmers' advanced learning journey. Advanced tracks in Java, Python, and C#. Blog: https://blog.csdn.net/badao_liumang_qizhi
