
Deploying Ollama on Windows and Linux and Integrating with SpringBoot

This guide explains how to download, install, and configure Ollama on Windows and Linux, set up environment variables, select a DeepSeek model, and call the Ollama API from a SpringBoot application with example code snippets.


Ollama is an open‑source framework designed for easy local deployment and execution of large language models (LLMs).

Download Ollama: the official download page (https://ollama.com/download) provides installers for macOS, Linux, and Windows; the GitHub releases page (https://github.com/ollama/ollama/releases) can also be used. For example, on Windows you can fetch the installer with:

wget https://github.com/ollama/ollama/releases/download/v0.5.8-rc10/OllamaSetup.exe

After downloading, run the installer (it installs to the C: drive by default and requires at least 10 GB of free space). Verify the installation by opening a command prompt and typing ollama; a usage message indicates success. Close any running Ollama process before proceeding.

Configure environment variables to control model storage and network settings:

OLLAMA_MODELS – path where models are stored, e.g., D:\ollama\models

OLLAMA_HOST – optional custom host, e.g., 0.0.0.0:8080 (useful when a UI expects port 8080)

OLLAMA_ORIGINS – optional CORS setting, e.g., *
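A client on the same machine calls http://localhost:11434 by default; if OLLAMA_HOST is overridden as above, the client must follow suit. A minimal sketch of deriving the base URL from the environment (the class and method names here are illustrative, not part of Ollama):

```java
public class OllamaHostResolver {

    /** Default host:port Ollama listens on when OLLAMA_HOST is not set. */
    private static final String DEFAULT_HOST = "localhost:11434";

    /**
     * Derives the HTTP base URL a client should call, mirroring how the
     * server reads OLLAMA_HOST. A "0.0.0.0" bind address is replaced with
     * "localhost" for client use, since 0.0.0.0 is not a reachable target.
     */
    public static String resolveBaseUrl(String ollamaHostEnv) {
        String host = (ollamaHostEnv == null || ollamaHostEnv.isBlank())
                ? DEFAULT_HOST
                : ollamaHostEnv.trim();
        if (host.startsWith("0.0.0.0")) {
            host = "localhost" + host.substring("0.0.0.0".length());
        }
        return "http://" + host;
    }

    public static void main(String[] args) {
        // Prints the URL the SpringBoot client below should target.
        System.out.println(resolveBaseUrl(System.getenv("OLLAMA_HOST")));
    }
}
```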

Select and download a DeepSeek model. Visit the model library (https://ollama.com/library/deepseek-r1) and choose a size that matches your hardware. For the 1.5B-parameter model, run:

ollama run deepseek-r1:1.5b

The command pulls the model and makes it ready for inference.

Linux deployment can be performed with a single script:

curl -fsSL https://ollama.com/install.sh | sh
sudo systemctl enable ollama
sudo systemctl status ollama
sudo systemctl start ollama
ollama run deepseek-r1:1.5b

API integration (SpringBoot example):

The Ollama server exposes a comprehensive REST API (see https://github.com/ollama/ollama/blob/main/docs/api.md). The following utility class sends a generation request to the /api/generate endpoint:

package com.example.springtestdemo.ai.util;

import com.alibaba.fastjson.JSONObject;
import lombok.extern.slf4j.Slf4j;

@Slf4j
public class OllamaUtil {

    private static final String DOMAIN = "http://localhost:11434/api/generate";

    public static String chatDeepSeek(String model, String question) {
        // Build the non-streaming generation request body.
        JSONObject body = new JSONObject();
        body.put("model", model);
        body.put("prompt", question);
        body.put("stream", false);
        // CommonUtil.postJson is a project-local HTTP helper (same package)
        // that POSTs a JSON string and returns the response body.
        String result = CommonUtil.postJson(DOMAIN, body.toJSONString());
        log.info("[ollama-request] raw result: {}", result);
        try {
            JSONObject resJson = JSONObject.parseObject(result);
            String response = resJson.getString("response");
            log.info("[ollama-request] response: {}", response);
            return response;
        } catch (Exception e) {
            log.error("[ollama-request] failed to parse response", e);
        }
        // Fallback returned when the response cannot be parsed.
        return "ok";
    }
}
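The utility above depends on a project-local CommonUtil.postJson helper that the article never shows. A plausible sketch using the JDK 11+ java.net.http client follows; the class and method names are assumptions matching the call site, not code from the original project:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CommonUtil {

    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    /** Builds the POST request; separated out so it can be inspected without a network call. */
    static HttpRequest buildJsonPost(String url, String jsonBody) {
        return HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonBody))
                .build();
    }

    /** POSTs a JSON body to the given URL and returns the response body as a string. */
    public static String postJson(String url, String jsonBody) {
        try {
            HttpResponse<String> resp =
                    CLIENT.send(buildJsonPost(url, jsonBody), HttpResponse.BodyHandlers.ofString());
            return resp.body();
        } catch (Exception e) {
            throw new RuntimeException("POST to " + url + " failed", e);
        }
    }
}
```

Keeping request construction in its own method makes the helper unit-testable without a running Ollama server.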

The corresponding controller exposes a POST endpoint:

package com.example.springtestdemo.ai.controller;

import com.example.springtestdemo.ai.QueryParam;
import com.example.springtestdemo.ai.util.OllamaUtil;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class AiController {

    @PostMapping("/deepSeek")
    public String deepSeek(@RequestBody QueryParam query) {
        // @RequestBody binds the JSON request body to QueryParam; without it,
        // Spring would only bind query or form parameters.
        return OllamaUtil.chatDeepSeek("deepseek-r1:1.5b", query.getQuestion());
    }
}
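The controller imports a QueryParam class that the article never shows. A minimal sketch consistent with the query.getQuestion() call site (the field name is inferred from the getter, and the class would live in the com.example.springtestdemo.ai package):

```java
/** Request payload for the /deepSeek endpoint; carries the user's question. */
public class QueryParam {

    private String question;

    public String getQuestion() {
        return question;
    }

    public void setQuestion(String question) {
        this.question = question;
    }
}
```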

For a graphical UI, the article suggests using an Edge browser extension (https://www.crxsoso.com/webstore/detail/jfgfiigpkhlkbnfnbobbkinehhfdhndo) that supports image upload and other features.

Overall, the guide provides step‑by‑step instructions for installing Ollama, configuring it, pulling a DeepSeek model, and calling the service from Java code.

Tags: LLM, Deployment, Linux, API, SpringBoot, DeepSeek, Windows, Ollama
Written by

Selected Java Interview Questions

A professional Java tech channel sharing common knowledge to help developers fill gaps. Follow us!
