How to Connect a XiaoAI Speaker to a Large Language Model

This guide walks through preparing a XiaoAI speaker, selecting a free LLM service, creating an API key, installing Docker, running the MiGPT server, and configuring the speaker to query the chosen large language model.

Infra Learning Club

Preparation

Verify that the XiaoAI speaker model is supported by MiGPT (the source's list of supported models did not render).

Set up the XiaoAI speaker in the Mi Home app, then obtain the Mi Home account ID and password for later configuration.

AI Service Selection

The project supports multiple AI providers. Free‑tier options include:

Groq – open‑source models (Llama, Gemma, etc.) with no charge.

SiliconFlow – open‑source models (Yi, Qwen, Llama, Gemma, etc.) with a completely free tier and an additional 14 CNY trial credit.

Paid options (ChatGPT, Zhipu AI, Doubao, Tongyi Qianwen, DeepSeek, Zero One Everything, Baichuan, Moonshot) have token‑based pricing ranging from 0.1 CNY to 12 CNY per million tokens. Because SiliconFlow offers a fully free tier, it is selected for the integration.

Create SiliconFlow Account

Register a SiliconFlow account and log in.

Generate an API key via Account Management → API Keys → New API Key.

In the model marketplace, choose a model (e.g., Yi, Qwen, Llama, Gemma) and record its exact name; no deployment step is required.

Start MiGPT Server

Install Docker. The author recommends OrbStack (https://orbstack.dev/) as a lighter alternative to Docker Desktop.

Pull the server image:

docker pull docker.m.daocloud.io/lmk123/migpt-server

Run the container:

docker run -d --name migpt-server -p 36592:36592 docker.m.daocloud.io/lmk123/migpt-server
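For unattended operation, the same container can be described in a Compose file. The sketch below mirrors the docker run command above; the restart policy is an addition not taken from the source:

```yaml
# docker-compose.yml — equivalent to the docker run command above
services:
  migpt-server:
    image: docker.m.daocloud.io/lmk123/migpt-server
    container_name: migpt-server
    ports:
      - "36592:36592"          # GUI and API port
    restart: unless-stopped    # assumption: keep the server up across reboots
```

Start it with `docker compose up -d` from the directory containing the file.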

Open the GUI in a browser at http://localhost:36592. If the container runs on a remote host, replace localhost with the host IP (e.g., http://192.168.1.1:36592).

Configure MiGPT

Enter speaker information: select the speaker model, then provide the Xiaomi account ID and password.

Configure the AI service (SiliconFlow example):

Endpoint URL: https://api.siliconflow.cn/v1

API key: the key generated in the previous step.

Model name: the exact model identifier recorded from the marketplace.
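Before wiring the credentials into MiGPT, you can sanity-check them with a direct call, since the SiliconFlow endpoint above is OpenAI-compatible. A minimal Python sketch; the model identifier shown is a hypothetical placeholder (use the exact name recorded from the marketplace), and actually sending the request requires a valid key:

```python
# Sketch: compose an OpenAI-compatible chat-completions request against the
# SiliconFlow endpoint from the MiGPT configuration above. The model name
# "Qwen/Qwen2.5-7B-Instruct" is an illustrative placeholder, not from the source.
import json
import urllib.request

API_BASE = "https://api.siliconflow.cn/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Compose the request object; it is not sent until urlopen() is called."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("sk-...", "Qwen/Qwen2.5-7B-Instruct", "Say hello")
# With a valid key, urllib.request.urlopen(req) should return a JSON completion;
# an authentication error here means the key or endpoint is misconfigured.
```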

Usage

After configuration, voice queries to the XiaoAI speaker are answered by the selected large language model through the MiGPT server. A demonstration video is referenced in the original source.

References

[1] ChatGPT – https://migptgui.com/docs/apply/chatgpt

[2] Groq – https://migptgui.com/docs/apply/groq

[3] SiliconFlow – https://migptgui.com/docs/apply/siliconflow

[4] Zhipu AI – https://migptgui.com/docs/apply/zhipu

[5] Doubao – https://migptgui.com/docs/apply/doubao

[6] Tongyi Qianwen – https://migptgui.com/docs/apply/tongyi

[7] DeepSeek – https://migptgui.com/docs/apply/deepseek

[8] Zero One Everything – https://migptgui.com/docs/apply/lingyi

[9] Baichuan (Baixiao) – https://migptgui.com/docs/apply/baichuan

[10] Moonshot (Kimi) – https://migptgui.com/docs/apply/moonshot

Republication Notice

This article has been distilled and summarized from source material and republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: Docker, large language model, SiliconFlow, XiaoAI, MiGPT