Deploy Deepseek‑R1 with Ollama and Open‑WebUI on a Linux Server

This guide walks you through installing Ollama on Linux, configuring its service, pulling the Deepseek‑R1 model, and exposing a friendly Open‑WebUI Docker interface for interactive AI chat and private knowledge‑base uploads.


Step 1: Install Ollama

Download and install Ollama on the Linux host:

curl -fsSL https://ollama.com/install.sh | sh

Edit the systemd service file to expose the service on all interfaces and allow any origin:

vi /etc/systemd/system/ollama.service
# add or modify the following lines
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
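Editing the vendor unit file directly works, but a package upgrade can overwrite it. A drop-in override is the systemd-native alternative; a minimal sketch setting the same two variables as above:

```shell
# Create a drop-in directory and override file (systemd convention);
# the variables match the ones added to the unit file above.
mkdir -p /etc/systemd/system/ollama.service.d
cat > /etc/systemd/system/ollama.service.d/override.conf <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
EOF
```

Either approach requires the `systemctl daemon-reload` and restart shown next.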

Reload the daemon, restart the service, and verify that port 11434 is listening:

systemctl daemon-reload
systemctl restart ollama
ss -anlp | grep 11434

Open a browser and navigate to http://YOUR_SERVER_IP:11434; the page should display “Ollama is running”.
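The browser check can also be done from the command line against Ollama's HTTP API; a quick sketch, assuming `curl` is installed and the server is reachable:

```shell
# Report the server version -- confirms the API answers on 11434
curl -s http://YOUR_SERVER_IP:11434/api/version

# List models the server knows about (empty until Step 2 pulls one)
curl -s http://YOUR_SERVER_IP:11434/api/tags
```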

Step 2: Deploy Deepseek‑R1 model

Pull the desired Deepseek‑R1 model (e.g., the 7B variant) using Ollama:

ollama pull deepseek-r1:7b

Verify that the model has been downloaded:

ollama list

Start an interactive chat session with the model:

ollama run deepseek-r1:7b
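Besides the interactive shell, the model can be queried programmatically over Ollama's REST API; a minimal sketch, assuming the 7B tag pulled above:

```shell
# One-shot, non-streaming completion against the local Ollama API
curl -s http://YOUR_SERVER_IP:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "Explain what a systemd drop-in file is in one sentence.",
  "stream": false
}'
```

The response is a JSON object whose `response` field holds the generated text.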

Step 3: Set up a visual interface (Open‑WebUI)

Install Docker CE (example for CentOS 7):

wget -O /etc/yum.repos.d/docker-ce.repo http://mirrors.aliyun.com/docker-ce/linux/centos/docker-ce.repo
yum install -y docker-ce-20.10.9-3.el7 docker-ce-cli-20.10.9-3.el7 containerd.io
systemctl start docker
docker --version
docker info
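On a fresh host it is also worth making Docker start at boot, so the Open‑WebUI container (run with `--restart always` below) comes back after a reboot; a one-line sketch:

```shell
# Enable the Docker service at boot (start was done above)
systemctl enable docker
```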

Run the Open‑WebUI container, pointing it to the Ollama endpoint:

docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://YOUR_SERVER_IP:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main

Access the UI at http://YOUR_SERVER_IP:3000. The interface allows uploading files to create a private knowledge base for the model.
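If the UI does not load, two quick checks help narrow it down; a sketch, assuming the container name used above:

```shell
# Follow the container's startup logs for errors
docker logs -f open-webui

# Confirm the published port answers locally (expect an HTTP response)
curl -sI http://localhost:3000
```

A common failure mode is Open‑WebUI being unable to reach `OLLAMA_BASE_URL`; the logs will show the connection error in that case.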

After completing these steps, Deepseek‑R1 runs on the Linux host and can be accessed through the Open‑WebUI for both chat and knowledge‑base queries.

Tags: Linux, Ollama, Open WebUI
Written by Full-Stack DevOps & Kubernetes

Focused on sharing DevOps, Kubernetes, Linux, Docker, Istio, microservices, Spring Cloud, Python, Go, databases, Nginx, Tomcat, cloud computing, and related technologies.
