How to Build a Qwen3‑Powered ChatBI Agent with PAI‑LangStudio and Hologres
This guide walks you through building a ChatBI intelligent agent by integrating Alibaba's Qwen3 large language model with PAI‑LangStudio, configuring a Model Context Protocol (MCP) server, and connecting to the Hologres real‑time data warehouse. It covers setup, deployment, and verification steps for enterprise data analysis.
Overview
This article explains how to use PAI‑LangStudio together with the Qwen3 large language model to build a ChatBI intelligent agent that leverages the Model Context Protocol (MCP) and Hologres for smart data analysis.
Background
Qwen3
Qwen3 is the latest generation of the Qwen LLM series, offering both dense and mixture‑of‑experts (MoE) variants. Its key features include:
Seamless switching between thinking mode (for complex reasoning, math, and code) and non‑thinking mode (for efficient general‑purpose dialogue).
Significantly improved reasoning ability, surpassing previous Qwen models in mathematics, code generation, and common‑sense logic.
Strong agent capabilities, enabling precise tool integration in both modes and leading performance among open‑source models.
Support for over 100 languages and dialects with robust multilingual understanding and generation.
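The thinking/non-thinking switch described above is typically exposed as a per-request parameter on Qwen3-compatible chat APIs. The sketch below shows how such a request payload might be assembled; the `chat_template_kwargs`/`enable_thinking` field names are assumptions based on common Qwen3 deployments, so check your serving framework's documentation for the exact parameter.

```python
# Sketch: toggling Qwen3's thinking mode per request.
# The `chat_template_kwargs`/`enable_thinking` flag is an assumption;
# verify the exact parameter name against your deployment's docs.

def build_chat_request(question: str, thinking: bool) -> dict:
    """Build an OpenAI-compatible chat payload for a Qwen3 service."""
    return {
        "model": "Qwen3-8B",
        "messages": [{"role": "user", "content": question}],
        # Hypothetical toggle for thinking mode vs. fast dialogue mode.
        "chat_template_kwargs": {"enable_thinking": thinking},
    }

payload = build_chat_request("Prove that sqrt(2) is irrational.", thinking=True)
```

A client would POST this payload to the model service's chat-completions endpoint; with `thinking=False` the same request favors low-latency general dialogue.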
PAI‑LangStudio – Large Model Development Platform
PAI‑LangStudio is an enterprise‑grade, one‑stop large‑model application development platform built on Alibaba Cloud PAI core capabilities. It simplifies the development workflow, provides programmable and real‑time debugging features, and supports end‑to‑end AI application deployment, including stateful multi‑agent workflows via PAI‑EAS.
Model Context Protocol (MCP)
Released by Anthropic in 2024, MCP standardizes the interface between LLMs and external tools or data sources, decoupling model decision logic from resources and enabling an "intelligent brain + external limbs" architecture.
Hologres – Real‑time Data Warehouse
Hologres is Alibaba Cloud's real‑time data warehouse offering massive OLAP capabilities, low‑latency serving, and deep integration with the Proxima vector computation library.
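Because Hologres is PostgreSQL-compatible, a standard Postgres driver can query it directly. Below is a minimal sketch; the host, credentials, and table name are placeholders, not values from this tutorial's sample data.

```python
# Sketch: querying Hologres over its PostgreSQL-compatible interface.
# Host, port, credentials, and table name below are placeholders.

def hologres_dsn(host: str, port: int, dbname: str,
                 user: str, password: str) -> str:
    """Assemble a libpq-style DSN for a Hologres instance."""
    return (f"host={host} port={port} dbname={dbname} "
            f"user={user} password={password}")

# Usage (requires psycopg2 and network access to the instance):
# import psycopg2
# conn = psycopg2.connect(hologres_dsn(
#     "hgpost-cn-xxx.hologres.aliyuncs.com", 80,
#     "chatbi_demo", "<access-key-id>", "<access-key-secret>"))
# with conn.cursor() as cur:
#     cur.execute("SELECT customer_id, SUM(quantity) FROM orders "
#                 "GROUP BY customer_id ORDER BY 2 DESC LIMIT 10")
#     print(cur.fetchall())
```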
Prerequisites
A VPC, subnet, and security group have been created (the Hologres MCP Server and the LangStudio agent must reside in the same region).
Access to Alibaba Cloud Function Compute and the Function AI MCP service.
PAI console access to create a workspace if none exists.
Step 1 – Deploy Qwen3 Model in Model Gallery
In the PAI console, navigate to Model Gallery → Large Language Model and select the appropriate Qwen3 model (e.g., Qwen3‑8B). Enable the tool‑call configuration for agent nodes.
Refer to the model configuration table for recommended settings.
Step 2 – Set Up Hologres MCP Server
Create a Hologres instance (or use an existing one) and import sample data. Then, in Function Compute, create an MCP Server service that connects to the Hologres instance.
Configure the region, RAM role, endpoint, and database, then deploy the project to obtain an SSE service endpoint and Bearer token.
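Before wiring the SSE endpoint into LangStudio, it is worth smoke-testing it with the Bearer token from the deployment output. The sketch below builds the expected request headers; the URL and token are the placeholder values shown in this guide's configuration.

```python
# Sketch: smoke-testing the MCP Server's SSE endpoint.
# The URL and token are placeholders taken from the deployment output.

def auth_headers(token: str) -> dict:
    """Build the Authorization header the MCP Server expects."""
    return {"Authorization": f"Bearer {token}",
            "Accept": "text/event-stream"}

# Usage (requires the `requests` package and network access):
# import requests
# resp = requests.get("https://xxx.xxx.run/sse",
#                     headers=auth_headers("xxxx"),
#                     stream=True, timeout=30)
# print(resp.status_code)  # 200 indicates the token was accepted
```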
Step 3 – Build the LangStudio Application Flow
Create a new runtime in LangStudio, then add a new connection of type "General LLM Model Service" pointing to the PAI‑EAS model service.
Next, create an application flow from the "Intelligent Data Agent" template.
Ensure the MCP Server and LangStudio runtime are in the same region to avoid VPC connectivity issues.
Step 4 – Configure Agent Settings
In the flow dashboard, set the agent strategy to FunctionCalling, select the previously created model connection, enable conversation history, and provide the MCP SSE endpoint and Bearer token.
Example MCP configuration (JSON) to be added as an input variable:
{
  "mcpServers": {
    "remote-server": {
      "type": "sse",
      "url": "https://xxx.xxx.run/sse",
      "headers": {
        "Authorization": "Bearer xxxx"
      },
      "timeout": 30
    }
  }
}
Then set the system prompt, user prompt, and maximum loop count (e.g., 5).
Step 5 – Deploy EAS Model Service for API Inference
After completing the flow development, click Deploy, choose an appropriate instance type and VPC, and confirm to create the PAI‑EAS model service.
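The deployed EAS service can then be invoked over HTTP with its service token. The sketch below assembles such a request; the endpoint path, token handling, and the `question` field name are assumptions, so check the service's invocation tab for the exact format your flow expects.

```python
# Sketch: assembling an invocation request for the deployed EAS flow.
# Endpoint URL, token, and the "question" field name are assumptions;
# consult the EAS service's invocation details for the real schema.

def build_eas_request(endpoint: str, token: str, question: str) -> tuple:
    """Assemble URL, headers, and JSON body for an EAS flow call."""
    headers = {"Authorization": token,
               "Content-Type": "application/json"}
    body = {"question": question}  # field name depends on the flow's input schema
    return endpoint, headers, body

url, headers, body = build_eas_request(
    "https://xxx.pai-eas.aliyuncs.com/api/predict/chatbi_agent",
    "<eas-token>",
    "Show the top-10 customers by quantity",
)
# Usage (requires the `requests` package):
# requests.post(url, headers=headers, json=body)
```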
Verification
Use the LangStudio chat interface to ask questions such as "What tools does MCP provide?" or "Show the top‑10 customers by quantity" and observe the generated responses.
Conclusion
By following these steps, you can quickly build a Qwen3‑based ChatBI agent with MCP and Hologres using PAI‑LangStudio, enabling intelligent data analysis for enterprise scenarios.