Deploy and Fine‑Tune Qwen1.5 LLM with Alibaba PAI‑QuickStart

This article introduces Alibaba Cloud's open‑source Qwen1.5 large language model series, highlights its multilingual, human‑preference alignment, and long‑context capabilities, and provides step‑by‑step guidance on using PAI‑QuickStart for model deployment, fine‑tuning, and Python SDK integration.

Alibaba Cloud Big Data AI Platform

Qwen1.5 Series Model Introduction

Qwen1.5 (Tongyi Qianwen 1.5) is Alibaba Cloud's latest open‑source large language model series, expanding on the 1.0 version with sizes ranging from 0.5B to 72B parameters. The series offers both Base and Chat variants and is fully supported by Alibaba Cloud's AI platform PAI.

Multilingual capability: significantly improved support for a wider range of languages and complex scenarios.

Human‑preference alignment: enhanced using Direct Preference Optimization (DPO) and Proximal Policy Optimization (PPO).

Long‑sequence support: all model sizes handle a context of up to 32,768 tokens.

Benchmark tests show strong performance in language understanding, code generation, reasoning, multilingual processing, and alignment with human preferences.

PAI‑QuickStart Introduction

PAI‑QuickStart is a component of Alibaba Cloud's AI platform PAI that bundles high‑quality open‑source pretrained models, including large language models, text‑to‑image, and speech recognition. It enables zero‑code or SDK‑based end‑to‑end training, deployment, and inference, simplifying AI development for developers and enterprises.

Runtime Environment Requirements

Supported regions: Beijing, Shanghai, Shenzhen, Hangzhou (Alibaba Cloud).

Resource requirements:

Qwen1.5‑0.5B/1.8B/4B/7B: minimum V100/P40/T4 (16 GB VRAM) for QLoRA fine‑tuning.

Qwen1.5‑14B: minimum V100 (32 GB VRAM) or A10 for QLoRA fine‑tuning.
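As a rough sanity check on these requirements, the 4‑bit quantized base weights dominate QLoRA memory use. The back‑of‑envelope sketch below is not a PAI formula; the 1.3× overhead factor for adapters, activations, and optimizer state is an assumption:

```python
def qlora_weight_memory_gb(n_params_billion: float, bits: int = 4,
                           overhead: float = 1.3) -> float:
    """Rough VRAM estimate (GiB) for QLoRA: quantized base weights
    plus a loose multiplier for adapters, activations, and cache."""
    bytes_weights = n_params_billion * 1e9 * bits / 8
    return round(bytes_weights * overhead / 1024**3, 1)

# A 7B model at 4-bit lands well under a 16 GB card by this estimate;
# 14B stays under 32 GB, consistent with the requirements above.
print(qlora_weight_memory_gb(7))
print(qlora_weight_memory_gb(14))
```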

Using PAI‑QuickStart with the Model

In the PAI console, select the Qwen1.5‑7B‑Chat model card. Deploy the model to the PAI‑EAS inference service by providing a service name and resource configuration. The deployed service can then be accessed through a ChatLLM WebUI.
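Beyond the WebUI, a deployed EAS service can be invoked over HTTP. The sketch below uses only the standard library; the endpoint, token, and request field names are placeholders and assumptions to replace with the values shown in the EAS console for your service:

```python
import json
import urllib.request

# Placeholders: copy the real endpoint and token from the PAI-EAS console.
EAS_ENDPOINT = "http://<your-service>.<region>.pai-eas.aliyuncs.com/"
EAS_TOKEN = "<your-eas-token>"

def build_payload(prompt: str, temperature: float = 0.7,
                  max_new_tokens: int = 256) -> dict:
    """Assemble a chat request body; the field names are an assumption."""
    return {"prompt": prompt,
            "temperature": temperature,
            "max_new_tokens": max_new_tokens}

def chat(prompt: str) -> str:
    """POST a prompt to the deployed service and return the raw response."""
    req = urllib.request.Request(
        EAS_ENDPOINT,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Authorization": EAS_TOKEN,
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return resp.read().decode("utf-8")
```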

For fine‑tuning, PAI provides a built‑in algorithm that accepts JSON‑formatted data in which each record contains instruction and output fields. The example below uses Chinese poetry‑writing instruction pairs:

[
    {
        "instruction": "写一首以“寓居夜感”为题的诗:",
        "output": "独坐晚凉侵,客窗秋意深。..."
    },
    {
        "instruction": "写一首以“次答友人思乡诗”为题的诗:",
        "output": "阅尽沧桑万事空,何如归卧夕阳中。..."
    }
]
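Before uploading, it can help to sanity‑check the dataset programmatically. A small sketch following the record format above (the helper name and output file name are arbitrary):

```python
import json

def validate_records(records: list) -> int:
    """Ensure every record has non-empty 'instruction' and 'output'
    strings; returns the number of valid records."""
    for i, rec in enumerate(records):
        for field in ("instruction", "output"):
            value = rec.get(field)
            if not isinstance(value, str) or not value.strip():
                raise ValueError(f"record {i}: missing or empty '{field}'")
    return len(records)

records = [
    {"instruction": "Write a poem titled 'Night Thoughts':", "output": "..."},
]
validate_records(records)

# ensure_ascii=False keeps Chinese text readable in the output file.
with open("train.json", "w", encoding="utf-8") as f:
    json.dump(records, f, ensure_ascii=False, indent=2)
```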

Upload the prepared dataset to an OSS bucket, then launch training on GPUs (V100/P40/T4 with 16 GB VRAM). Hyper‑parameter settings can be customized or left at defaults.

After training, the model can be deployed to PAI‑EAS with a single click, using the same invocation method as direct deployment.

Python SDK Usage

The PAI Python SDK allows programmatic access to model deployment and fine‑tuning. Example code to deploy a model:

from pai.model import RegisteredModel

# Get the model provided by PAI
model = RegisteredModel(
    model_name="qwen1.5-7b-chat",
    model_provider="pai"
)

# Deploy the model
predictor = model.deploy(service_name="qwen7b_chat_example")
print(predictor.console_uri)

Example code to start fine‑tuning:

# Get the fine‑tuning estimator
est = model.get_estimator()
training_inputs = model.get_estimator_inputs()
# Optionally update with custom data paths
# training_inputs.update({"train": "<OSS or local path>", "validation": "<path>"})
est.fit(inputs=training_inputs)
print(est.model_data())
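The path printed by est.model_data() is typically an OSS URI (an assumption consistent with the OSS‑backed workflow above). A small helper to split it into bucket and object key, e.g. for downloading the artifacts with ossutil:

```python
from urllib.parse import urlparse

def split_oss_uri(uri: str) -> tuple:
    """Split an OSS URI like 'oss://bucket/path/to/model/'
    into (bucket, key)."""
    parsed = urlparse(uri)
    if parsed.scheme != "oss":
        raise ValueError(f"not an OSS URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")

# Example with a hypothetical output path:
bucket, key = split_oss_uri("oss://my-bucket/models/qwen1_5/")
print(bucket, key)
```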

Detailed notebook examples are available via the “Open in DSW” link on the model card.

Conclusion

Qwen1.5 marks Alibaba Cloud's latest advancement in open‑source LLMs, offering multiple model sizes for diverse downstream applications. Developers can quickly customize, fine‑tune, and deploy these models using PAI‑QuickStart, which also aggregates a variety of advanced models across many domains.

Related Resources

Qwen1.5 introduction: https://qwenlm.github.io/zh/blog/qwen1.5/

PAI QuickStart guide: https://help.aliyun.com/zh/pai/user-guide/quick-start-overview

PAI Python SDK GitHub: https://github.com/aliyun/pai-python-sdk

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: Model deployment, fine-tuning, PAI-QuickStart, Qwen1.5
Written by

Alibaba Cloud Big Data AI Platform

The Alibaba Cloud Big Data AI Platform builds on Alibaba’s leading cloud infrastructure, big‑data and AI engineering capabilities, scenario algorithms, and extensive industry experience to offer enterprises and developers a one‑stop, cloud‑native big‑data and AI capability suite. It boosts AI development efficiency, enables large‑scale AI deployment across industries, and drives business value.
