Deploy and Fine‑Tune EasyAnimate for High‑Res Video Generation on Alibaba Cloud PAI
EasyAnimate is Alibaba Cloud PAI's self-developed DiT-based video generation framework, providing a complete pipeline for HD long-video generation: data preprocessing, VAE training, DiT training, inference, and evaluation. This guide walks through integrating EasyAnimate on PAI: setting up prerequisites, creating a DSW instance, installing the model, running inference via code or the WebUI, fine-tuning with LoRA, and calling the API.
Background
This tutorial describes two ways to generate videos: using DSW (Data Science Workshop) and using QuickStart.
Cost
New users of DSW or EAS are eligible for a free trial quota; see the free-quota documentation. Other users are billed for the resources they consume (for example, ecs.gn7i-c8g1.2xlarge).
Prerequisites
Create a PAI workspace.
(Optional) Enable OSS or NAS.
Method 1: Using DSW
Step 1 – Create DSW Instance
Log in to the PAI console, select the target region, open the target workspace, and navigate to Model Development & Training > Interactive Modeling (DSW).
Click “Create Instance”.
Configure the instance. Example values:
Instance Name: AIGC_test_01
Resource Specification: a GPU specification such as ecs.gn7i-c8g1.2xlarge, or another A10/GU100 specification
Image: the official image easyanimate:1.1.4-pytorch2.2.0-gpu-py310-cu118-ubuntu22.04
Mount Configuration (optional): click "Add" > "Create Dataset" to create an OSS or NAS dataset
Confirm the creation.
Step 2 – Install EasyAnimate Model
Open the DSW instance, go to the Notebook launcher, and click the “Quick Start” area under the DSW Gallery to open the EasyAnimate tutorial.
Run the environment-installation cells, which define helper functions, download the code, and download the model. Execute each step in order.
Step 3 – Inference
Option A: Code inference – click the "Model Inference > Use Code for Inference" node and run the command. Results are saved in /mnt/workspace/demos/easyanimate/EasyAnimate/samples/easyanimate-videos. The following parameters can be adjusted:
prompt – positive prompt describing the desired video.
negative_prompt – negative prompt describing content to avoid.
num_inference_steps – number of denoising steps.
guidance_scale – guidance coefficient.
width, height – video resolution in pixels.
video_length – number of frames.
fps – frame rate.
save_dir – output directory.
seed – random seed.
lora_weight – LoRA weight.
lora_path, transformer_path, motion_module_path – paths to the corresponding model files.
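The adjustable parameters above can be collected into a single configuration before launching the inference command. A minimal sketch (the parameter names come from the list above, but the values and the CLI flag style are illustrative assumptions, not the official interface):

```python
# Generation parameters from the tutorial; values here are illustrative only.
params = {
    "prompt": "A panda eating bamboo in a sunlit forest",
    "negative_prompt": "blurry, low resolution, watermark",
    "num_inference_steps": 30,   # denoising steps
    "guidance_scale": 7.0,       # guidance coefficient
    "width": 672,
    "height": 384,
    "video_length": 48,          # number of frames
    "fps": 12,
    "seed": 43,
    "save_dir": "samples/easyanimate-videos",
    "lora_weight": 0.55,
}

# Turn the dict into CLI-style flags (hypothetical flag format).
cli_args = [f"--{key}={value}" for key, value in params.items()]
```

Keeping all generation settings in one dict makes it easy to log the exact configuration alongside each generated sample.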
Option B: WebUI inference – click the "Model Inference > UI Launch" node, open the generated link, select the pretrained model path, the fine-tuned base model, and the LoRA model, set the LoRA weight and other parameters, then click Generate. The generated video appears on the right side and can be downloaded.
Method 2: Using QuickStart
Step 1 – Deploy Model
Log in to the PAI console, open the workspace, and go to QuickStart.
Search for "EasyAnimate HD video generation", click Deploy, and configure the parameters. EasyAnimate supports bf16 inference only; choose an A10 or higher GPU.
Confirm the deployment; wait until the status changes to “Running”.
Step 2 – Use Model
After deployment you can generate videos via WebUI or API.
WebUI
In the service detail page click “View Web App”.
Select the pretrained model and configure parameters as needed.
Click Generate and download the resulting video.
API
Obtain the service URL and token from the "Resource Details" section of the service detail page, then call the inference endpoint. The tutorial provides example Python code for updating the transformer, updating the edition, and performing inference.
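As a rough sketch of how such an API call is typically structured for an EAS service (the payload schema and header format here are assumptions, not the documented interface; substitute the real service URL and token from Resource Details and consult the provided example code for the exact fields):

```python
import json
from urllib.request import Request, urlopen


def build_request(service_url: str, token: str, prompt: str, **params) -> Request:
    """Build an authenticated POST request for the inference service.

    The payload schema is an assumption; consult the service's example
    code for the exact field names it expects.
    """
    payload = {"prompt": prompt, **params}
    return Request(
        service_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": token, "Content-Type": "application/json"},
        method="POST",
    )


# Usage (placeholders; fill in the real URL and token before running):
# req = build_request("http://<service-url>", "<token>",
#                     "A cat surfing a wave", num_inference_steps=30)
# with urlopen(req) as resp:
#     result = json.load(resp)
```

Sending the token in the Authorization header is the usual pattern for PAI-EAS services; keep the token out of source control.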
All generated files are stored under /mnt/workspace/demos/easyanimate/.
Alibaba Cloud Big Data AI Platform
The Alibaba Cloud Big Data AI Platform builds on Alibaba’s leading cloud infrastructure, big‑data and AI engineering capabilities, scenario algorithms, and extensive industry experience to offer enterprises and developers a one‑stop, cloud‑native big‑data and AI capability suite. It boosts AI development efficiency, enables large‑scale AI deployment across industries, and drives business value.