Building a Smart Customer Service with Spring Boot, LangChain4j, and Ollama Function Calling

This guide walks through setting up a local LLM with Ollama, configuring Spring Boot and LangChain4j, defining function‑calling tools for weather, order status, logistics and coupons, creating AI service beans, exposing REST controllers, and troubleshooting common integration issues.


Scenario

Use Spring Boot, LangChain4j, and Ollama to run a local large language model (LLM) that supports Function Calling, and build a smart customer-service chatbot on top of it.

1. What is Function Calling?

Function Calling lets an LLM request calls to external functions while generating a response, so it can obtain real-time data or perform actions such as fetching the weather or querying a database. This improves answer accuracy and enables automation.
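Conceptually, function calling is a loop that LangChain4j runs for you: the model emits a tool name plus arguments, the application executes the matching Java method, and the result is fed back to the model for the final answer. A minimal, framework-free sketch of the dispatch step (the tool names and canned data here are illustrative assumptions, not part of any library):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ToolDispatchSketch {

    // Registry mapping tool names to Java functions, analogous to what
    // LangChain4j builds from @Tool-annotated methods via reflection.
    static final Map<String, Function<String, String>> TOOLS = new HashMap<>();

    static {
        TOOLS.put("get_weather", city ->
                "Beijing".equalsIgnoreCase(city)
                        ? "Sunny, 25°C"
                        : "No weather data for " + city);
        TOOLS.put("get_order_status", orderId ->
                "123456".equals(orderId) ? "Shipped" : "Order not found");
    }

    // The dispatch step: the model's tool-call request (name + argument)
    // is resolved against the registry and executed; the return value is
    // what gets sent back to the model as the tool result.
    static String dispatch(String toolName, String argument) {
        Function<String, String> tool = TOOLS.get(toolName);
        if (tool == null) {
            return "Unknown tool: " + toolName;
        }
        return tool.apply(argument);
    }

    public static void main(String[] args) {
        System.out.println(dispatch("get_weather", "Beijing"));
        System.out.println(dispatch("get_order_status", "123456"));
    }
}
```

LangChain4j automates exactly this loop, including serializing the tool schemas for the model and parsing its tool-call replies.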

2. Environment Preparation and Model Selection

2.1 Prerequisites

An Ollama model that supports Function Calling: qwen2:7b (recommended for Chinese), llama3.1:8b, or mistral:7b-v0.3

LangChain4j version >= 1.0.0-beta4

Spring Boot 3.x, JDK 17

2.2 Configure Ollama Mirror for China

Create %USERPROFILE%\.ollama\config.json (e.g., C:\Users\admin\.ollama\config.json) and map registry.ollama.ai to your preferred mirror. The value below is the default registry; substitute your mirror's URL:

{
  "registry": {
    "mirrors": {
      "registry.ollama.ai": "https://registry.ollama.ai"
    }
  }
}

Restart Ollama after editing.

2.3 Pull a Supporting Model

ollama pull qwen2:7b
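A quick way to validate tool support before any Spring code is involved is to POST a tool schema directly to Ollama's /api/chat endpoint (the get_weather schema below is a made-up example, not from the article's code):

```shell
# Write a probe request containing one tool definition.
cat > /tmp/tool-probe.json <<'EOF'
{
  "model": "qwen2:7b",
  "stream": false,
  "messages": [
    { "role": "user", "content": "What is the weather in Beijing?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string", "description": "City name, e.g. Beijing" }
          },
          "required": ["city"]
        }
      }
    }
  ]
}
EOF

# A tool-capable model answers with a "tool_calls" entry in the message;
# a model without tool support returns a 400 "does not support tools" error.
if curl -s --max-time 2 http://localhost:11434/api/tags > /dev/null; then
  curl -s http://localhost:11434/api/chat -d @/tmp/tool-probe.json
else
  echo "Ollama is not reachable on localhost:11434"
fi
```

If the reply contains tool_calls, the model is safe to use with LangChain4j's @Tool support.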

3. Spring Boot Project Dependencies and YAML Changes

Update pom.xml (excerpt):

<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <version>3.2.5</version>
</parent>

<properties>
  <java.version>17</java.version>
  <langchain4j.version>1.0.0-beta4</langchain4j.version>
</properties>

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>dev.langchain4j</groupId>
      <artifactId>langchain4j-bom</artifactId>
      <version>${langchain4j.version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <!-- Ollama starter: provides the auto-configured ChatModel used below -->
  <dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-ollama-spring-boot-starter</artifactId>
  </dependency>
</dependencies>

Configure application.yml for Ollama:

langchain4j:
  ollama:
    chat-model:
      base-url: http://localhost:11434
      model-name: qwen2:7b
      temperature: 0.7
      timeout: PT120S          # total timeout 120 seconds
      connect-timeout: PT10S   # connection timeout 10 seconds
      read-timeout: PT120S     # read timeout 120 seconds
      log-requests: true
      log-responses: true

4. Defining Tools (using @Tool annotation)

4.1 Weather Query Tool

package com.badao.ai.tool;

import dev.langchain4j.agent.tool.P;
import dev.langchain4j.agent.tool.Tool;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;

@Component
public class WeatherTool {
    @Tool("Get the current weather for the specified city")
    public String getWeather(@P("City name, e.g. Beijing") String city) {
        if ("Beijing".equalsIgnoreCase(city)) {
            return "It is currently sunny in Beijing, 25°C.";
        }
        return "Unable to retrieve weather information for " + city + ".";
    }

    @Tool("Get the current date and time")
    public String getCurrentDateTime(@P("Any text; indicates you want the current date and time") String placeholder) {
        return "The current time is: " + LocalDateTime.now();
    }
    }
}

4.2 Smart Customer‑Service Tools (order, logistics, coupons)

package com.badao.ai.tool;

import dev.langchain4j.agent.tool.P;
import dev.langchain4j.agent.tool.Tool;
import org.springframework.stereotype.Component;
import java.util.HashMap;
import java.util.Map;

@Component
public class CustomerServiceTools {
    private static final Map<String, String> ORDER_STATUS_DB = new HashMap<>();
    private static final Map<String, String> ORDER_LOGISTICS_DB = new HashMap<>();
    private static final Map<String, String> COUPON_DB = new HashMap<>();
    static {
        ORDER_STATUS_DB.put("123456", "Shipped");
        ORDER_STATUS_DB.put("789012", "Awaiting payment");
        ORDER_STATUS_DB.put("345678", "Completed");
        ORDER_LOGISTICS_DB.put("123456", "STO Express: 777888999, parcel picked up");
        ORDER_LOGISTICS_DB.put("345678", "SF Express: SF123456789, delivered");
        COUPON_DB.put("user_1", "One ¥10-off-¥100 coupon");
        COUPON_DB.put("user_2", "One no-minimum ¥5 coupon");
    }

    @Tool("Query order status. Returns the current status (shipped, awaiting payment, completed, etc.) for an order number")
    public String getOrderStatus(@P("Order number, e.g. 123456") String orderId) {
        String status = ORDER_STATUS_DB.getOrDefault(orderId, "order not found");
        return "Order " + orderId + " current status: " + status;
    }

    @Tool("Query order logistics. Returns the carrier, tracking number, and latest status for an order number")
    public String getLogistics(@P("Order number, e.g. 123456") String orderId) {
        String logistics = ORDER_LOGISTICS_DB.getOrDefault(orderId, "no logistics information yet; the order may not have shipped");
        return "Order " + orderId + " logistics: " + logistics;
    }

    @Tool("Query a user's available coupons. Returns coupon information for a user ID")
    public String getCoupon(@P("User ID, e.g. user_1") String userId) {
        String coupon = COUPON_DB.getOrDefault(userId, "you currently have no available coupons");
        return "User " + userId + " coupons: " + coupon;
    }
}

5. Creating AI Service Beans

5.1 Define Assistant Interfaces

package com.badao.ai.config;

import dev.langchain4j.service.SystemMessage;

public interface Assistant {
    @SystemMessage("You are a helpful AI assistant. You can use tools to answer the user's questions about weather and time.")
    String chat(String userMessage);
}

public interface CustomerServiceAssistant {
    @SystemMessage("""
        You are a smart customer-service assistant who can answer questions about order status, logistics, coupons, and more.
        Use the provided tools to obtain accurate information.
        If the user asks about order status, call the getOrderStatus tool.
        If the user asks about logistics, call the getLogistics tool.
        If the user asks about coupons, call the getCoupon tool.
        Keep answers concise and friendly, and reply in Chinese.
        """)
    String chat(String userMessage);
}

5.2 Manual AiServices Bean Construction (recommended)

package com.badao.ai.config;

import com.badao.ai.tool.CustomerServiceTools;
import com.badao.ai.tool.WeatherTool;
import dev.langchain4j.memory.chat.ChatMemoryProvider;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.service.AiServices;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AiConfig {
    @Bean
    public ChatMemoryProvider chatMemoryProvider() {
        return memoryId -> MessageWindowChatMemory.withMaxMessages(10);
    }

    @Bean(name = "assistant")
    public Assistant assistant(ChatModel chatModel,
                               WeatherTool weatherTool,
                               ChatMemoryProvider chatMemoryProvider) {
        return AiServices.builder(Assistant.class)
                .chatModel(chatModel)
                .tools(weatherTool)
                .chatMemoryProvider(chatMemoryProvider) // per-conversation message window
                .build();
    }

    @Bean(name = "customAssistant")
    public CustomerServiceAssistant customerServiceAssistant(ChatModel chatModel,
                                                             CustomerServiceTools tools,
                                                             ChatMemoryProvider chatMemoryProvider) {
        return AiServices.builder(CustomerServiceAssistant.class)
                .chatModel(chatModel)
                .tools(tools)
                .chatMemoryProvider(chatMemoryProvider)
                .build();
    }
}

Note: tools() expects tool instances, not class objects. The @SystemMessage annotation on interface methods is automatically recognized.
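The ChatMemoryProvider bean defined in AiConfig hands each conversation a MessageWindowChatMemory that keeps only the 10 most recent messages, which bounds the prompt sent to the model. A framework-free sketch of that eviction policy (a plain deque standing in for the LangChain4j class):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

public class MessageWindowSketch {

    private final int maxMessages;
    private final Deque<String> window = new ArrayDeque<>();

    public MessageWindowSketch(int maxMessages) {
        this.maxMessages = maxMessages;
    }

    // Add a message; once the window is full, evict the oldest entry so the
    // conversation history stays bounded.
    public void add(String message) {
        if (window.size() == maxMessages) {
            window.removeFirst();
        }
        window.addLast(message);
    }

    // Oldest-first snapshot of what would be sent to the model.
    public List<String> messages() {
        return List.copyOf(window);
    }

    public static void main(String[] args) {
        MessageWindowSketch memory = new MessageWindowSketch(3);
        for (int i = 1; i <= 5; i++) {
            memory.add("message " + i);
        }
        // Only the 3 most recent messages survive.
        System.out.println(memory.messages());
    }
}
```

MessageWindowChatMemory.withMaxMessages(10) applies the same idea to LangChain4j ChatMessage objects, with additional handling for system messages.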

6. Controllers and Testing

6.1 Weather Controller

package com.badao.ai.controller;

import com.badao.ai.config.Assistant;
import org.springframework.web.bind.annotation.*;

@RestController
public class ChatController {
    private final Assistant aiAssistant;
    public ChatController(Assistant aiAssistant) { this.aiAssistant = aiAssistant; }
    @GetMapping("/ai/assistant")
    public String askAssistant(@RequestParam String message) {
        return aiAssistant.chat(message);
    }
}

6.2 Customer‑Service Controller

package com.badao.ai.controller;

import com.badao.ai.config.CustomerServiceAssistant;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/customer")
public class CustomerServiceController {
    private final CustomerServiceAssistant assistant;
    public CustomerServiceController(CustomerServiceAssistant assistant) { this.assistant = assistant; }
    @GetMapping("/chat")
    public String chat(@RequestParam("message") String message) {
        return assistant.chat(message);
    }
}

6.3 Test Commands

Weather query (the paths must match the controller mappings above; adjust the port to your server.port setting):

curl -G "http://localhost:885/ai/assistant" --data-urlencode "message=What is the weather in Beijing?"

Order status:

curl -G "http://localhost:885/customer/chat" --data-urlencode "message=What is the status of my order 123456?"

7. Common Issues

Model does not support tools – error 400 Bad Request: {"error":"... does not support tools"}. Solution: switch to an official Ollama model that supports Function Calling (e.g., qwen2:7b).

Bean name conflict – ConflictingBeanDefinitionException when both @AiService and manual @Bean are used. Solution: choose one method; manual bean creation is recommended.

Missing systemMessage method – occurs if manual bean construction omits systemMessage(). Solution: keep the @SystemMessage annotation on the interface method or provide a systemMessageProvider().

8. Summary

Prefer Ollama official models that support Function Calling (e.g., qwen2:7b).

Define each tool with a clear description and annotate parameters with @P.

Manage dependencies via a BOM to keep LangChain4j versions consistent.

Manually create AiServices beans to avoid component‑scan conflicts.

Place system prompts in @SystemMessage on the interface for declarative style.

Increase read timeout for slower local model inference.

Validate model tool support with a direct curl call before integrating into Spring.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: Java, LLM, spring-boot, Function Calling, AI Integration, Ollama, LangChain4j
Written by The Dominant Programmer

Resources and tutorials for programmers' advanced learning journey. Advanced tracks in Java, Python, and C#. Blog: https://blog.csdn.net/badao_liumang_qizhi
