Build a Local Go AI Agent with Ollama and DeepSeek – MVP Guide
This article walks you through creating a fully offline, extensible AI programming assistant in Go using Ollama and the DeepSeek-R1 model, covering project layout, message formats, function calling, tool integration, a simple WebSocket UI, and ideas for future extensions.
Overview
This article demonstrates how to build a locally‑run AI Agent in pure Go using Ollama with the DeepSeek‑R1 model. The agent can converse, invoke tools (code execution, file I/O, web search, etc.), and be extended further.
Project Structure (Minimal MVP)
easy-agent/
├── main.go
├── agent/
│ ├── agent.go
│ ├── agency.go
│ ├── memory.go
│ ├── ollama_client.go
│ └── tools.go
└── client/
    └── index.html

agent/ : core logic of the AI Agent
client/ : simple WebSocket front‑end for testing
ollama_client.go : wrapper for calling a local LLM via Ollama
agency.go : main agent loop that handles function calls
tools.go : definitions of tool functions
memory.go : lightweight in‑memory store
Message Format and Function Calling
The agent and the model exchange a list of messages. Example messages:

{"role": "user", "content": "Explain the following code for me"}
{"role": "assistant", "content": "…"}
{"role": "tool", "name": "run_code", "content": "execution result…"}

When a tool is required, the model returns a function_call object:
{
"message": {
"role": "assistant",
"function_call": {
"name": "run_code",
"arguments": "{\"language\":\"go\",\"code\":\"...\"}"
}
}
}

Processing steps:
1. Parse the function_call field.
2. Execute the corresponding tool (e.g., run code).
3. Append a new message with role: tool and the tool output.
4. Call the model again with the updated message list.
5. Repeat until the model produces a final answer.
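These steps rely on a handful of Go types that the code excerpts below use without declaring. A minimal sketch of what they might look like, inferred from the JSON examples above (the field names and the OpenAI-style Choices wrapper are assumptions; the repository's declarations may differ):

// Hypothetical type declarations inferred from the JSON examples above.
type ChatMessage struct {
    Role    string `json:"role"`
    Name    string `json:"name,omitempty"`
    Content string `json:"content,omitempty"`
}

type FunctionCall struct {
    Name      string `json:"name"`
    Arguments string `json:"arguments"` // JSON-encoded argument object
}

type ChatRequest struct {
    Model      string        `json:"model"`
    Messages   []ChatMessage `json:"messages"`
    Tools      any           `json:"tools,omitempty"`
    ToolChoice string        `json:"tool_choice,omitempty"`
}

type AssistantMessage struct {
    Role         string        `json:"role"`
    Content      string        `json:"content"`
    FunctionCall *FunctionCall `json:"function_call,omitempty"`
}

type ChatResponse struct {
    Choices []struct {
        Message AssistantMessage `json:"message"`
    } `json:"choices"`
}

Note that the function_call example above nests a single top-level message, while the agent loop later reads cr.Choices[0].Message in the OpenAI style; the sketch follows the loop's usage.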
Ollama Client (High‑Quality Version)
File:
agent/ollama_client.go

type OllamaClient struct {
    Endpoint string
    Client   *http.Client
    Model    string
}

func NewOllamaClient(endpoint string, timeout time.Duration, model string) *OllamaClient {
    return &OllamaClient{Endpoint: endpoint, Client: &http.Client{Timeout: timeout}, Model: model}
}

func (c *OllamaClient) Call(ctx context.Context, messages []ChatMessage, tools any) (*ChatResponse, error) {
    reqBody := ChatRequest{Model: c.Model, Messages: messages, Tools: tools, ToolChoice: "auto"}
    b, err := json.Marshal(reqBody)
    if err != nil { return nil, err }
    req, err := http.NewRequestWithContext(ctx, http.MethodPost, c.Endpoint, bytes.NewReader(b))
    if err != nil { return nil, err }
    req.Header.Set("Content-Type", "application/json")
    resp, err := c.Client.Do(req)
    if err != nil { return nil, err }
    defer resp.Body.Close()
    if resp.StatusCode != http.StatusOK {
        return nil, fmt.Errorf("ollama returned status %s", resp.Status)
    }
    var cr ChatResponse
    if err := json.NewDecoder(resp.Body).Decode(&cr); err != nil { return nil, err }
    return &cr, nil
}

This client can call any Ollama-compatible model such as DeepSeek-R1, Qwen, or LLaMA.
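A quick usage sketch, assuming Ollama's default chat endpoint at http://localhost:11434/api/chat and a locally pulled deepseek-r1 model (response field access follows the ChatResponse sketch above):

// Hypothetical usage, not code from the repository.
client := NewOllamaClient("http://localhost:11434/api/chat", 120*time.Second, "deepseek-r1")
resp, err := client.Call(context.Background(), []ChatMessage{
    {Role: "user", Content: "Explain goroutines in one paragraph"},
}, nil) // nil tools: plain chat without function calling
if err != nil {
    log.Fatal(err)
}
fmt.Println(resp.Choices[0].Message.Content)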
Agent Loop (Core Reasoning Engine)
File:
agent/agency.go

func (a *Agent) Run(prompt string) (string, error) {
    messages := []ChatMessage{
        {Role: "system", Content: "You are an AI programming partner. When a task needs to be executed, call a tool."},
        {Role: "user", Content: prompt},
    }
    for i := 0; i < 6; i++ {
        cr, err := a.client.Call(context.Background(), messages, toolsMetadata())
        if err != nil { return "", err }
        msg := cr.Choices[0].Message
        if msg.FunctionCall != nil {
            // Record the assistant's function call, then forward it to the tool system.
            messages = append(messages, ChatMessage{Role: msg.Role, Name: msg.FunctionCall.Name, Content: msg.FunctionCall.Arguments})
            result := a.execTool(msg.FunctionCall)
            messages = append(messages, ChatMessage{Role: "tool", Name: msg.FunctionCall.Name, Content: result})
            continue
        }
        return msg.Content, nil
    }
    return "", fmt.Errorf("loop limit reached without a final answer")
}

The loop seeds the conversation with system and user messages, lets the model think, executes any requested tool, feeds the result back, and repeats until a final answer is produced (or the iteration limit is hit).
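The Agent type itself never appears in these excerpts. A minimal sketch of how it might be declared so the loop above compiles (the memory field and constructor are assumptions based on the project layout):

// Hypothetical glue code; the repository's agent.go may differ.
type Agent struct {
    client *OllamaClient
    memory *Memory // the lightweight store from agent/memory.go
}

func NewAgent(client *OllamaClient) *Agent {
    return &Agent{client: client, memory: &Memory{}}
}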
Tool System
File:
agent/tools.go

func (a *Agent) execTool(fc *FunctionCall) string {
switch fc.Name {
case "run_code":
return RunCodeSandbox(fc.Arguments)
case "read_file":
return ReadFile(fc.Arguments)
case "web_search":
return WebSearch(fc.Arguments)
default:
return "unknown tool"
}
}

Tool metadata is supplied to the model as a JSON schema:
func toolsMetadata() any {
return []map[string]any{{
"type": "function",
"function": map[string]any{
"name": "web_search",
"description": "联网搜索相关信息",
"parameters": map[string]any{
"type": "object",
"properties": map[string]any{
"query": map[string]any{"type": "string"},
"num_results": map[string]any{"type": "integer"},
},
"required": []string{"query"},
},
},
}}
}

Adding a Web Search Tool (Example Stub)
func WebSearch(args WebSearchArgs) ([]SearchResult, error) {
    // Use SerpAPI, DuckDuckGo, Bing API, or scrape search pages.
    var results []SearchResult
    // … perform the search and populate results …
    return results, nil
}

When the model decides to call web_search, it sends a function call such as:
{"function_call": {"name": "web_search", "arguments": "{\"query\":\"Go map race condition\",\"num_results\":3}"}}The Agent runs the search, feeds the results back, and the model returns a concise answer.
Frontend Debugging Interface (WebSocket Streaming)
Run the agent with go run main.go, then open client/index.html in a browser. The UI streams text from the agent and lets you submit queries such as "Help me analyze this Go program". The data flow is:
Browser → WebSocket → Agent → Model → Tool → Model → Agent → Browser
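main.go itself is not shown in the article. A minimal sketch of the WebSocket bridge it might contain, using gorilla/websocket (the library choice, module path, port, and handler shape are all assumptions; the real repository may stream tokens incrementally rather than sending one final answer per message):

package main

import (
    "log"
    "net/http"
    "time"

    "github.com/gorilla/websocket"

    "easy-agent/agent" // hypothetical module path
)

// Allow the local index.html to connect regardless of origin.
var upgrader = websocket.Upgrader{
    CheckOrigin: func(r *http.Request) bool { return true },
}

func main() {
    a := agent.NewAgent(agent.NewOllamaClient(
        "http://localhost:11434/api/chat", 120*time.Second, "deepseek-r1"))

    http.HandleFunc("/ws", func(w http.ResponseWriter, r *http.Request) {
        conn, err := upgrader.Upgrade(w, r, nil)
        if err != nil {
            log.Println("upgrade:", err)
            return
        }
        defer conn.Close()
        for {
            // Each browser message is treated as one user prompt.
            _, prompt, err := conn.ReadMessage()
            if err != nil {
                return
            }
            answer, err := a.Run(string(prompt))
            if err != nil {
                answer = "error: " + err.Error()
            }
            conn.WriteMessage(websocket.TextMessage, []byte(answer))
        }
    })

    log.Fatal(http.ListenAndServe(":8080", nil))
}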
Source Code
GitHub: https://github.com/louis-xie-programmer/easy-agent
Gitee: https://gitee.com/louis_xie/easy-agent
Code Wrench
Focuses on code debugging, performance optimization, and real-world engineering, sharing efficient development tips and pitfall guides. We break down technical challenges in a down-to-earth style, helping you craft handy tools so every line of code becomes a problem‑solving weapon. 🔧💻