
Build a Local AI Knowledge Base with Ollama, DeepSeek‑R1, AnythingLLM & VSCode

This guide walks you through setting up a powerful local AI knowledge base using Ollama, DeepSeek‑R1, and AnythingLLM, and shows how to integrate the Continue extension into VSCode for seamless, secure, and efficient development workflows.

JD Cloud Developers

1. Ollama + DeepSeek‑R1 + AnythingLLM Local Knowledge Base Setup

Ollama, DeepSeek‑R1, and AnythingLLM form the core of a local knowledge‑base system. Ollama manages large language models (LLMs) on your machine, DeepSeek‑R1 provides strong language understanding and generation, and AnythingLLM offers a unified AI application for retrieval‑augmented generation (RAG) and AI agents.

2. Ollama Installation

2.1 Before Installing

Ollama is an open‑source tool for running and managing LLMs locally, allowing deployment without cloud services. It simplifies model download, installation, and management via command line or API.
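The management surface mentioned above is a small set of subcommands. A quick sketch of the ones you will use most often once Ollama is installed (model names here are just examples):

```shell
# List models already downloaded to this machine
ollama list

# Download a model without starting an interactive chat session
ollama pull deepseek-r1:7b

# Show models currently loaded in memory
ollama ps

# Remove a model you no longer need to free disk space
ollama rm deepseek-r1:7b
```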

2.2 Download

Visit https://ollama.com/, click Download, and choose the build that matches your operating system and hardware (e.g., the macOS version for a Mac with an Apple M3 Pro chip).

2.3 Install

Run the installer, launch Ollama, and verify it’s running by opening

http://localhost:11434/

in a browser; the page should display “Ollama is running”.
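You can also verify from the command line: Ollama serves an HTTP API on port 11434, which is the same endpoint AnythingLLM and Continue will talk to later. A quick check (assumes curl is available):

```shell
# Root endpoint returns a plain-text status message ("Ollama is running")
curl http://localhost:11434/

# Version endpoint returns JSON, e.g. {"version":"0.5.7"}
curl http://localhost:11434/api/version
```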

3. DeepSeek‑R1 Installation

3.1 Download Options

DeepSeek‑R1 can be downloaded through Ollama (https://ollama.com/library/deepseek-r1) or from Hugging Face. On an Apple M3 Pro with 18 GB of RAM, the 1.5B model is recommended for lightweight tasks, while the 7B model can handle more demanding workloads.
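On memory‑constrained machines it can make sense to pull the smaller distilled variant first and step up only if needed. A sketch using the tags published in the Ollama library (download sizes are approximate):

```shell
# ~1 GB download; fine for quick summaries and simple Q&A
ollama pull deepseek-r1:1.5b

# ~4-5 GB download; stronger reasoning, needs more RAM
ollama pull deepseek-r1:7b

# Confirm which variants are installed
ollama list
```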

3.2 Install

Run the following command in a terminal:

```shell
ollama run deepseek-r1:7b
```

Wait a few minutes for the model to download. When it finishes, an interactive prompt opens; test it by typing “Who are you?” in the terminal.
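Beyond the interactive terminal session, the same model can be queried over Ollama's REST API, which is what tools like AnythingLLM and Continue use under the hood. A minimal sketch:

```shell
# One-shot, non-streaming generation against the local Ollama server
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "Who are you?",
  "stream": false
}'
```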

4. AnythingLLM Installation

4.1 What Is AnythingLLM?

AnythingLLM is a zero‑setup, self‑hosted AI application that combines local LLMs, RAG, and AI agents in a single interface, built with JavaScript for easy extension.

4.2 Install

Download the desktop client from https://anythingllm.com/desktop, launch it, and select Ollama as the LLM provider, choosing the installed DeepSeek‑R1 model (e.g., the 7B variant).

4.3 Feeding Documents

Upload local documents (e.g., a Word file) via the workspace’s upload button, move them to the workspace, and then query the knowledge base (e.g., “Summarize the main points of the document in under 100 words”).

4.4 Browser Extension for Document Ingestion

Install the “AnythingLLM Browser Companion” Chrome extension, generate an API key in the AnythingLLM client, paste the connection string into the extension, and use “Embed entire page to workspace” to ingest web pages.

5. Continue Integration in VSCode

Continue is an open‑source AI code assistant for VSCode and JetBrains, offering chat, inline code suggestions, and editing without leaving the editor.

5.1 Install VSCode

Download and install VSCode from https://code.visualstudio.com/download if not already installed.

5.2 Install Continue Extension

In VSCode’s Extensions view, search for “Continue”, click Install, and restart VSCode.

5.3 Configure Continue

Open Continue, click “+ Add Chat model”, set Provider to “Ollama”, Model to “Autodetect”, and click Connect.
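Instead of Autodetect, you can pin the model explicitly in Continue's configuration file. A hedged sketch: the `~/.continue/config.json` path and `models` schema below are those used by older Continue releases (newer versions read a `config.yaml`), so check the format for the version you installed:

```shell
# Write a minimal Continue config pointing at the local Ollama model.
# Path and JSON schema are assumptions based on older Continue releases.
mkdir -p ~/.continue
cat > ~/.continue/config.json <<'EOF'
{
  "models": [
    {
      "title": "DeepSeek-R1 7B (local)",
      "provider": "ollama",
      "model": "deepseek-r1:7b"
    }
  ]
}
EOF
```

After saving, reload VSCode so Continue picks up the new configuration.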

Test the setup by asking “What model are you?” and verify the response.

6. Conclusion

Integrating Ollama, DeepSeek‑R1, AnythingLLM, and Continue creates a secure, locally hosted AI knowledge base that protects sensitive data and boosts development productivity, though it requires a capable machine to run smoothly.

DeepSeek · VSCode · Ollama · AnythingLLM · Local LLM · AI Knowledge Base
Written by JD Cloud Developers

JD Cloud Developers (Developer of JD Technology) is a JD Technology Group platform offering technical sharing and communication for AI, cloud computing, IoT and related developers. It publishes JD product technical information, industry content, and tech event news. Embrace technology and partner with developers to envision the future.
