Build a MiniManus AI Agent in 10 Minutes with Qwen3, Qwen‑Agent, and MCP

This tutorial walks through registering API keys, setting up a conda environment, integrating the Firecrawl MCP server, writing Qwen‑Agent code, and extending the agent with Amap MCP to create a multi‑functional MiniManus AI application in roughly ten minutes.

Fun with Large Models

Project preparation

Required steps before coding:

Register a Qwen3 API key on the Alibaba Cloud Bailian large‑model platform.

Create a conda virtual environment and install the uv and qwen-agent Python packages.

Obtain a Firecrawl API key from the Firecrawl service (URL: https://www.firecrawl.dev/).
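Before writing any agent code, it can save debugging time to confirm the keys are actually available. A minimal sketch, assuming the keys are exported as environment variables named DASHSCOPE_API_KEY and FIRECRAWL_API_KEY (`check_api_keys` is a hypothetical helper, not part of any SDK):

```python
import os

def check_api_keys(required=("DASHSCOPE_API_KEY", "FIRECRAWL_API_KEY")):
    """Return the names of any required API keys missing from the environment."""
    return [name for name in required if not os.environ.get(name)]

missing = check_api_keys()
if missing:
    print("Missing API keys:", ", ".join(missing))
```

Hard-coding keys in source, as the snippets below do for brevity, is fine for a ten-minute demo but worth avoiding in anything shared.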

Agent initialization

Define a function that creates an Assistant instance, configures the large language model, and registers MCP services.

from qwen_agent.agents import Assistant
from qwen_agent.utils.output_beautify import typewriter_print

def init_agent_service():
    llm_cfg = {
        'model': 'qwen3-235b-a22b',
        'model_server': 'dashscope',
        'api_key': 'your Bailian API key',
        'generate_cfg': {'top_p': 0.8}
    }
    tools = [{
        "mcpServers": {
            "firecrawl-mcp": {
                "command": "npx",
                "args": ["-y", "firecrawl-mcp"],
                "env": {"FIRECRAWL_API_KEY": "your Firecrawl API key"}
            }
        }
    }]
    system = """You are a planner and data analyst capable of extracting web information for analysis."""
    bot = Assistant(
        llm=llm_cfg,
        name='智能助理',  # "Intelligent Assistant"
        description='具备查询高德地图、提取网页信息、数据分析的能力',  # can query Amap, extract web pages, and analyze data
        system_message=system,
        function_list=tools,
    )
    return bot
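The tools list above follows the standard mcpServers layout used by MCP client configurations: a single dict with an "mcpServers" key mapping each server name to its launch command, arguments, and environment. A small sketch that walks this structure (`mcp_server_names` is a hypothetical helper, not part of Qwen‑Agent):

```python
def mcp_server_names(tools):
    """Collect the names of all MCP servers declared in a Qwen-Agent tools list."""
    names = []
    for entry in tools:
        if isinstance(entry, dict) and "mcpServers" in entry:
            names.extend(entry["mcpServers"])  # dict keys are the server names
    return names

tools = [{
    "mcpServers": {
        "firecrawl-mcp": {"command": "npx", "args": ["-y", "firecrawl-mcp"]}
    }
}]
print(mcp_server_names(tools))  # prints ['firecrawl-mcp']
```

Each server here is launched locally via npx, so Node.js must be installed for the MCP tools to start.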

Query workflow

Define a function that launches the built‑in Gradio WebUI with preset prompt suggestions.

def run_query(query=None):
    bot = init_agent_service()
    from qwen_agent.gui import WebUI
    chatbot_config = {
        'prompt.suggestions': [
            # "Extract this page as Markdown, then draw a bar chart of each project's star count"
            "https://github.com/orgs/QwenLM/repositories 提取这一页的Markdown 文档,然后绘制一个柱状图展示每个项目的收藏量",
            # "Find a route from the Forbidden City to the Summer Palace"
            "帮我查询从故宫去颐和园的路线"
        ]
    }
    WebUI(bot, chatbot_config=chatbot_config).run()

if __name__ == '__main__':
    run_query()

Running the script starts the WebUI at localhost:7860. The right‑hand panel displays the prompt suggestions; selecting a suggestion sends the request to the Firecrawl MCP service, which fetches the GitHub repository page, extracts Markdown, and generates a bar chart of star counts. The second suggestion triggers the Amap MCP service (when added) to return a travel route.

Extending the agent with Amap

To add travel‑planning capabilities, append an Amap MCP server definition to the tools list and update the system prompt.

tools = [{
    "mcpServers": {
        "firecrawl-mcp": {
            "command": "npx",
            "args": ["-y", "firecrawl-mcp"],
            "env": {"FIRECRAWL_API_KEY": "your Firecrawl API key"}
        },
        "amap-mcp-server": {
            "command": "npx",
            "args": ["-y", "@amap/amap-maps-mcp-server"],
            "env": {"AMAP_MAPS_API_KEY": "your Amap API key"}
        }
    }
}]

system = """You are a planner and data analyst.
You can call Amap to plan travel routes, and you can extract web information for data analysis."""

Make sure the prompt.suggestions array includes a travel‑route request such as '帮我查询从故宫去颐和园的路线' ("Find a route from the Forbidden City to the Summer Palace"). After these modifications, the agent can both analyze GitHub statistics and provide Amap travel routes.
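Because each MCP server is just another key under mcpServers, extending the agent amounts to a dict update. A sketch of that pattern (`add_mcp_server` is a hypothetical helper; the npx package names match the configuration above):

```python
def add_mcp_server(tools, name, package, env=None):
    """Register one more npx-launched MCP server in the first mcpServers entry."""
    tools[0]["mcpServers"][name] = {
        "command": "npx",
        "args": ["-y", package],
        "env": env or {},
    }
    return tools

tools = [{"mcpServers": {
    "firecrawl-mcp": {
        "command": "npx",
        "args": ["-y", "firecrawl-mcp"],
        "env": {"FIRECRAWL_API_KEY": "your Firecrawl API key"},
    }
}}]
add_mcp_server(tools, "amap-mcp-server", "@amap/amap-maps-mcp-server",
               {"AMAP_MAPS_API_KEY": "your Amap API key"})
print(sorted(tools[0]["mcpServers"]))  # prints ['amap-mcp-server', 'firecrawl-mcp']
```

The agent's routing between Firecrawl and Amap is handled by the model itself: it picks whichever tool matches the user's request, so no dispatch code is needed.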

Conclusion

The combination of Qwen‑Agent, MCP servers, and the Gradio‑based WebUI enables rapid development of multi‑functional AI agents with minimal code. The MCP SDK 1.8.0 release adds Streamable HTTP transport support, paving the way for enterprise‑grade, concurrent MCP tool deployments.

Tags: MCP, WebUI, Amap, Qwen3, Qwen-Agent, Firecrawl
Written by Fun with Large Models

Master's graduate from Beijing Institute of Technology with four papers in top journals; previously a developer at ByteDance and Alibaba, now researching large models at a major state‑owned enterprise. Committed to sharing concise, practical AI large‑model development experience, believing that AI large models will become as essential as PCs. Let's start experimenting now!
