Integrate DeepSeek with Spring AI: Step‑by‑Step Spring Boot Guide
This tutorial walks you through integrating DeepSeek via Spring AI into a Spring Boot project, covering Spring AI basics, obtaining an API key, adding dependencies and configuration, implementing controller endpoints, testing with Postman, and accessing the full source code.
Spring AI Overview
Spring AI is an open‑source framework from the Spring ecosystem that simplifies AI integration for Java developers through abstracted, modular APIs, supporting major AI services such as OpenAI, DeepSeek, Google, and Ollama.
Unified abstraction API for various AI providers.
Core modules for model interaction, vector handling, retrieval‑augmented generation (RAG), and function calling.
Low‑code integration via Spring Boot Starter.
Structured output mapping model responses to Java objects.
Flux‑based streaming responses for real‑time chat.
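The "unified abstraction" idea can be illustrated with a minimal plain-Java sketch. Note that the names below are hypothetical stand-ins, not Spring AI's real classes: the point is only that application code depends on one chat interface while concrete providers (OpenAI, DeepSeek, Ollama, ...) are swapped behind it.

```java
// Illustrative sketch only: hypothetical names, NOT Spring AI's actual API.
// Application code targets one interface; providers plug in behind it.
interface ChatModel {
    String call(String message);
}

// Stand-in for a concrete provider (a real one would call a remote API).
class EchoModel implements ChatModel {
    public String call(String message) {
        return "echo: " + message;
    }
}

public class UnifiedAbstractionDemo {
    // Written once against the interface; works with any provider.
    public static String ask(ChatModel model, String question) {
        return model.call(question);
    }

    public static void main(String[] args) {
        System.out.println(ask(new EchoModel(), "hello"));
    }
}
```

Swapping providers then becomes a matter of configuration rather than code changes, which is what the Spring Boot starter automates.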
Obtaining an API Key
Use Alibaba Cloud Bailian's DeepSeek service. Locate the DeepSeek model at
https://bailian.console.aliyun.com, click "Try Now" (立即体验), and then click the key icon to generate your API key.
Usage
Add the Spring AI dependency to your Spring Boot project:
<code><dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
    <version>1.0.0-M6</version>
</dependency></code>
Configure the AI service in application.yml:
<code>spring:
  ai:
    openai:
      # API key for the AI service
      api-key: <YOUR_API_KEY>
      # Base URL for Alibaba Cloud Bailian
      base-url: https://dashscope.aliyuncs.com/compatible-mode
      chat:
        options:
          # Model selection (e.g., deepseek-r1 or deepseek-v3)
          model: deepseek-r1
          # Controls randomness of generation (lower = more deterministic)
          temperature: 0.8</code>
Create a controller to expose two endpoints – one for direct answers and another for streaming responses:
<code>import java.util.Map;

import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import reactor.core.publisher.Flux;

/**
 * @author macrozheng
 * @description Controller for DeepSeek integration
 */
@RestController
public class DeepSeekController {

    private final OpenAiChatModel chatModel;

    @Autowired
    public DeepSeekController(OpenAiChatModel chatModel) {
        this.chatModel = chatModel;
    }

    /** Return a direct answer */
    @GetMapping("/ai/chat")
    public Map<String, String> chat(@RequestParam(value = "message") String message) {
        return Map.of("generation", this.chatModel.call(message));
    }

    /** Return a streaming answer */
    @GetMapping(value = "/ai/chatFlux", produces = MediaType.TEXT_EVENT_STREAM_VALUE + "; charset=UTF-8")
    public Flux<ChatResponse> chatFlux(@RequestParam(value = "message") String message) {
        Prompt prompt = new Prompt(new UserMessage(message));
        return this.chatModel.stream(prompt);
    }
}
</code>
Testing
Start the application and use Postman to call the endpoints:
Direct answer:
http://localhost:8080/ai/chat?message=YourQuestion
Streaming answer:
http://localhost:8080/ai/chatFlux?message=YourQuestion
The complete response text can be retrieved from the result.output.text property.
Conclusion
Integrating DeepSeek with Spring AI is straightforward. For a more responsive experience, prefer the streaming endpoint, which delivers the first tokens sooner, and concatenate the streamed text to reconstruct the full answer.
Source Code
https://github.com/macrozheng/spring-examples/tree/master/spring-deepseek
References
Spring AI documentation: https://docs.spring.io/spring-ai/reference/api/chat/deepseek-chat.html
Alibaba Cloud Bailian DeepSeek integration guide: https://help.aliyun.com/zh/model-studio/developer-reference/deepseek
macrozheng
Dedicated to Java tech sharing and dissecting top open-source projects. Topics include Spring Boot, Spring Cloud, Docker, Kubernetes and more. Author’s GitHub project “mall” has 50K+ stars.