Getting Started with Spring AI: Building a Hello‑World Application Using DeepSeek
This tutorial introduces Spring AI, then walks through creating a Spring Boot project with Maven, adding the necessary dependencies, writing a simple controller that forwards user messages to a local DeepSeek model served by Ollama, configuring the application, and testing the AI-powered endpoint.
Spring AI is a framework that helps Java developers build AI applications quickly, with support for the major AI models, cross-model portability, vector databases, mapping of model output to Java objects, and function calling.
The article is organized in four parts: an introduction to Spring AI, writing a Spring AI program, testing, and a summary.
What is Spring AI
Spring AI aims to simplify AI application development by reducing boilerplate and glue code. It supports models from providers such as Anthropic, OpenAI, Microsoft, Amazon, Google, and Ollama; offers a portable API that works across models; integrates with vector databases including Cassandra, Elasticsearch, Neo4j, Redis, and SAP HANA; maps model output automatically to POJOs; and supports function calling so that models can interact with external APIs.
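To give a flavor of the POJO mapping mentioned above, here is a minimal sketch assuming the Spring AI `ChatClient` fluent API; the `BookRecommendation` record and the prompt text are invented for illustration:

```java
import org.springframework.ai.chat.client.ChatClient;

public class StructuredOutputExample {

    // A plain Java record that the model's free-text reply is converted into.
    record BookRecommendation(String title, String author) {}

    public static BookRecommendation recommend(ChatClient chatClient) {
        // .entity(...) asks Spring AI to map the model's output onto the given type
        // instead of returning a raw String.
        return chatClient.prompt()
                .user("Recommend one classic Java book")
                .call()
                .entity(BookRecommendation.class);
    }
}
```

This is the same `ChatClient` used in the controller below; only the terminal call changes from `content()` (raw text) to `entity(...)` (typed object).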
Creating the Spring AI Project
First, create a parent Maven project (pom.xml):
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.3.4</version>
        <relativePath/>
    </parent>
    <groupId>com.myai</groupId>
    <artifactId>example</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
        <module>dm-sample</module>
    </modules>
    <properties>
        <java.version>17</java.version>
    </properties>
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-web</artifactId>
                <version>3.3.4</version>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-test</artifactId>
                <version>3.3.4</version>
            </dependency>
            <dependency>
                <groupId>org.springframework.ai</groupId>
                <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
                <version>1.0.0-SNAPSHOT</version>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <!-- Spring AI snapshot artifacts are not on Maven Central,
         so the Spring snapshot repository is required. -->
    <repositories>
        <repository>
            <id>spring-snapshots</id>
            <url>https://repo.spring.io/snapshot</url>
            <releases><enabled>false</enabled></releases>
        </repository>
    </repositories>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

Then create the module dm-sample with its own pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.myai</groupId>
        <artifactId>example</artifactId>
        <version>0.0.1-SNAPSHOT</version>
    </parent>
    <artifactId>dm-sample</artifactId>
    <packaging>jar</packaging>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

The module inherits the groupId and version from the parent, so it does not need to redeclare them. Next, add a controller that receives a message, forwards it to the DeepSeek model, and returns the result:
package com.myai.doc.controller;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/ai")
public class ChatController {

    private final ChatClient chatClient;

    // Spring AI auto-configures a ChatClient.Builder for the model
    // declared in application.yaml.
    public ChatController(ChatClient.Builder chatClientBuilder) {
        this.chatClient = chatClientBuilder.build();
    }

    @GetMapping("/chat")
    public String chat(@RequestParam("message") String message) {
        try {
            // Forward the user's message to the model and return its reply as text.
            return chatClient.prompt().user(message).call().content();
        } catch (Exception e) {
            return "Error calling the model: " + e.getMessage();
        }
    }
}

The corresponding application.yaml configures the local Ollama server and selects the DeepSeek model:
server:
  servlet:
    encoding:
      charset: UTF-8
      enabled: true
      force: true
spring:
  ai:
    ollama:
      base-url: http://localhost:11434
      chat:
        options:
          model: deepseek-r1:1.5b

Note that in Spring Boot 3 the response encoding is configured under server.servlet.encoding, and the Ollama chat model is selected via spring.ai.ollama.chat.options.model. Make sure Ollama is running and that the model has already been pulled locally (ollama pull deepseek-r1:1.5b). After building the project, run the Spring Boot application and call the endpoint:
http://localhost:8080/ai/chat?message=hello

The response will contain the model’s reply, demonstrating a successful Hello-World integration of Spring AI with DeepSeek.
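The tutorial does not show the Spring Boot entry point. Assuming the standard setup, a minimal main class would look like this (the class name is illustrative; it must live in a package above the controller, e.g. com.myai.doc, so component scanning finds ChatController):

```java
package com.myai.doc;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Standard Spring Boot entry point; @SpringBootApplication enables
// auto-configuration and scans com.myai.doc and its subpackages for beans.
@SpringBootApplication
public class DmSampleApplication {
    public static void main(String[] args) {
        SpringApplication.run(DmSampleApplication.class, args);
    }
}
```

With this class in src/main/java, `mvn spring-boot:run` from the dm-sample module starts the service on port 8080.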
Summary
Spring AI gives Java developers a Spring-native way to integrate large language models; by following the steps above, you can quickly stand up a Spring Boot service that talks to a local DeepSeek model, opening many possibilities for AI-enhanced Java applications.