
Build a Local AI Assistant with DeepSeek and Ollama in 10 Minutes

This guide walks you through installing Ollama, downloading the DeepSeek model, and configuring the Chatbox AI client so you can run a powerful local AI assistant on Windows, macOS, or Linux within minutes.

JD Cloud Developers

If you want to quickly set up a local AI assistant, DeepSeek is a ready-to-use alternative to OpenAI that runs on your own machine without complex setup.

1. Install DeepSeek

DeepSeek runs on top of Ollama, an open-source framework designed for easy deployment of large language models (LLMs) on local machines. Ollama offers a Docker-like workflow for pulling and running models and works on macOS, Linux, and Windows. Download it from https://ollama.com/.

After installing Ollama, open a terminal and run the following command, which downloads the DeepSeek model on first use and then starts it:

ollama run deepseek-r1:7b

Once the model is downloaded, your computer becomes a small DeepSeek server ready to answer queries.
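You can verify this by querying Ollama's local HTTP API directly (by default it listens on port 11434). A minimal Python sketch, using the model name from the step above:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> bytes:
    # /api/generate takes a JSON body; stream=False returns one complete reply
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires Ollama to be running locally):
# print(ask("deepseek-r1:7b", "Why is the sky blue?"))
```

The actual call is left as a comment since it needs the local server running; the payload format matches Ollama's generate API.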

If you have multiple models, you can list them with

ollama list

and start any of them with

ollama run <model>

2. Install Chatbox

Chatbox AI is a client application that supports many AI models and APIs, available on Windows, macOS, Android, iOS, Linux, and the web.

Download and launch Chatbox, then set your preferred interface language. In the model settings, choose Ollama as the API provider, select the DeepSeek model you just installed, and save the configuration.

After saving, you can start using the various features of Chatbox with your local DeepSeek model.
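Behind the GUI, Chatbox talks to the local Ollama server over HTTP. As a rough sketch of what a chat turn looks like at that level (assuming Ollama's default port 11434), the chat endpoint takes the whole message history, which is how follow-up questions keep their context:

```python
import json
import urllib.request

CHAT_URL = "http://localhost:11434/api/chat"  # the same local server Chatbox uses

def chat_body(model: str, messages: list) -> bytes:
    # /api/chat receives the full message history; stream=False returns one reply
    return json.dumps({"model": model, "messages": messages, "stream": False}).encode()

def chat(model: str, messages: list) -> dict:
    req = urllib.request.Request(
        CHAT_URL,
        data=chat_body(model, messages),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]

# Example (requires Ollama to be running locally):
# history = [{"role": "user", "content": "Hello!"}]
# history.append(chat("deepseek-r1:7b", history))  # append reply to keep context
```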

Written by

JD Cloud Developers

JD Cloud Developers (Developer of JD Technology) is a JD Technology Group platform offering technical sharing and communication for AI, cloud computing, IoT and related developers. It publishes JD product technical information, industry content, and tech event news. Embrace technology and partner with developers to envision the future.
