
2017 China Cloud Computing Evaluation: User Experience, Performance, and Usability Insights

The 2017 China Cloud Computing Evaluation Report by Tingyun analyzes major cloud providers across user experience, performance, availability, and ease of use, using a 132‑item questionnaire and verification tests to deliver regional performance scores and comprehensive provider rankings.


2017 China Cloud Computing Evaluation Report

In recent years, big data and artificial intelligence have revitalized cloud computing, expanding market size and prompting enterprises to focus on cloud selection. This report by Tingyun assesses major Chinese cloud providers from a user perspective, covering comprehensive user experience, performance & availability, and service usability.

1. Cloud Computing Comprehensive User Experience

The report uses a questionnaire covering 132 capabilities (compute, storage, network, elasticity, monitoring, support, data services) and a verification test where each provider runs a WordPress instance on Kubernetes, simulating real‑user traffic and applying stress‑IO workloads to measure CPU usage differences. Results are visualized in regional radar charts.

All cloud services are deployed and monitored uniformly via Kubernetes; each server runs WordPress and is accessed by simulated users, while a pressure‑IO program stresses disk and CPU to expose performance differences between providers. The original report includes a diagram of this verification architecture.
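The snippet below is a minimal sketch of the kind of workload the report describes, not Tingyun's actual tooling: one thread issues periodic page loads against a per‑provider WordPress instance while another keeps the disk busy, so CPU categories (user, system, iowait, steal) diverge under identical load. The URL and file path are placeholders.

```python
# Hypothetical sketch of the verification workload: simulated user traffic
# plus a disk-I/O stressor running side by side on each provider's VM.
import os
import threading
import time
import urllib.request

WORDPRESS_URL = "http://example.com/"   # placeholder for the provider's WordPress instance
STRESS_FILE = "/tmp/stress_io.bin"      # scratch file for the I/O stressor
BLOCK = b"\0" * (4 * 1024 * 1024)       # 4 MiB write block

def simulate_users(duration_s: float, interval_s: float = 1.0) -> None:
    """Issue periodic page loads and record wall-clock response time."""
    deadline = time.time() + duration_s
    while time.time() < deadline:
        start = time.time()
        try:
            with urllib.request.urlopen(WORDPRESS_URL, timeout=10) as resp:
                resp.read()
            print(f"page load ok in {time.time() - start:.3f}s")
        except Exception as exc:
            print(f"page load failed: {exc}")
        time.sleep(interval_s)

def stress_io(duration_s: float) -> None:
    """Repeatedly write and fsync a large block to keep the disk busy."""
    deadline = time.time() + duration_s
    while time.time() < deadline:
        with open(STRESS_FILE, "wb") as f:
            f.write(BLOCK)
            f.flush()
            os.fsync(f.fileno())
    os.remove(STRESS_FILE)

if __name__ == "__main__":
    users = threading.Thread(target=simulate_users, args=(60,))
    io = threading.Thread(target=stress_io, args=(60,))
    users.start(); io.start()
    users.join(); io.join()
```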

Key findings: Among the tested regions, Shanxi and Shaanxi show the best overall network performance. In Shanxi, Alibaba Cloud scores 28 (first‑screen time 1.21 s, success rate 99.96 %); in Ningxia, AWS scores 29 (first‑screen time 1.11 s, success rate 99.98 %). Comparable metrics are reported for Huawei Cloud, Kingsoft Cloud, Tencent Cloud, UCloud, Microsoft Cloud, Mobile Cloud, and others, covering DNS time, connection time, first‑packet time, latency, and packet loss across provinces.

2. Cloud Computing Performance and Availability

Performance is measured by system‑CPU usage, user‑CPU usage, stolen CPU, IO‑wait CPU, disk I/O rates, database response time, and system load. Scores (0‑10) are derived from these metrics.
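As an illustration of how these CPU categories can be sampled, the sketch below reads the Linux /proc/stat counters and converts deltas to percentages. The report does not publish its scoring formula, so the 0‑10 mapping here is only a simple linear example, not Tingyun's method.

```python
# Illustrative only: sample /proc/stat twice and derive per-category CPU
# percentages (user, system, iowait, steal), then map one metric to a
# hypothetical 0-10 score where lower usage scores higher.
import time

def read_cpu_times():
    """Return the aggregate CPU time counters from the first line of /proc/stat."""
    with open("/proc/stat") as f:
        fields = f.readline().split()[1:]
    user, nice, system, idle, iowait, irq, softirq, steal = map(int, fields[:8])
    return {"user": user + nice, "system": system, "idle": idle,
            "iowait": iowait, "irq": irq + softirq, "steal": steal}

def cpu_percentages(interval_s: float = 1.0):
    """Sample twice and convert the deltas to percentages per category."""
    a = read_cpu_times()
    time.sleep(interval_s)
    b = read_cpu_times()
    delta = {k: b[k] - a[k] for k in a}
    total = sum(delta.values()) or 1
    return {k: 100.0 * v / total for k, v in delta.items()}

def score(value_pct: float, worst: float, best: float = 0.0) -> float:
    """Map a 'lower is better' percentage onto a 0-10 scale (assumed, not the report's formula)."""
    clamped = min(max(value_pct, best), worst)
    return round(10.0 * (worst - clamped) / (worst - best), 2)

if __name__ == "__main__":
    pct = cpu_percentages()
    print(pct)
    # e.g. 0 % system CPU -> 10 points, 50 % or more -> 0 points
    print("system-CPU score:", score(pct["system"], worst=50.0))
```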

Alibaba Cloud’s strength is zero stolen‑CPU usage, but it is weaker on system‑CPU usage (45.21 %) and disk‑write speed (32.18 MB/s). AWS excels in low system‑CPU (21.64 %), user‑CPU (7.91 %), and IO‑wait (0.000009 %) usage but lags in disk I/O. Huawei Cloud shows zero stolen CPU and a high disk‑write rate (52.34 MB/s), yet a higher system load (3.06) and system‑CPU usage (41.85 %). Kingsoft Cloud leads in database response time (0.21 ms) but has high system‑CPU usage (45.05 %). Tencent Cloud and UCloud both have zero stolen CPU; UCloud also shows a low system load (1.08) and fast database response (0.17 ms). Microsoft Cloud and Mobile Cloud likewise show zero stolen CPU; Mobile Cloud has the highest disk‑read rate (97.23 MB/s), while its system‑CPU usage (43.42 %) is the weakest.

3. Cloud Computing Ease of Use

Ease‑of‑use scores (0‑100) combine questionnaire results and verification data. Alibaba Cloud scores high on monitoring and network features; AWS is strong in monitoring, auto‑scaling, and network but lags in storage. Huawei Cloud’s auto‑scaling is its best feature. Kingsoft Cloud excels in auto‑scaling but needs better storage tagging and priority support. Tencent Cloud offers balanced support with strong auto‑scaling, network, and technical support, though storage services need improvement. UCloud highlights auto‑scaling and compute instance support but is weak on storage. Microsoft Cloud is balanced, with strong auto‑scaling but limited storage options. Mobile Cloud leads in technical support and disaster‑recovery, while its storage support requires enhancement.
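The report does not disclose how questionnaire and verification results are weighted, so the following is purely a hypothetical sketch of one way a 0‑100 composite could be formed; the 60/40 split and the example numbers are assumptions, not figures from the report.

```python
# Hypothetical ease-of-use composite: blend questionnaire coverage
# (fraction of the 132 capabilities supported) with a 0-100 verification
# score. Weights are assumed, not Tingyun's published methodology.
def ease_of_use_score(questionnaire_hits: int, questionnaire_total: int,
                      verification_score: float,
                      questionnaire_weight: float = 0.6) -> float:
    """Return a 0-100 score combining questionnaire coverage and verification results."""
    coverage = 100.0 * questionnaire_hits / questionnaire_total
    verification_weight = 1.0 - questionnaire_weight
    return round(questionnaire_weight * coverage
                 + verification_weight * verification_score, 1)

# Example: a provider confirming 110 of 132 capabilities and scoring 78 in
# verification lands around 81 under this assumed weighting.
print(ease_of_use_score(110, 132, 78.0))
```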

Evaluation Metric Explanation

User‑experience metrics include first‑screen time, homepage load time, success rate, DNS time, connection time, first‑packet time, latency, and packet loss. Performance and availability metrics cover system‑CPU, user‑CPU, stolen CPU, IO‑wait CPU, database response time, disk I/O rates, server response time, and system load. Ease‑of‑use metrics involve data services, monitoring, auto‑scaling, network, storage, compute instances, and technical support.
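For concreteness, the sketch below shows how three of the user‑experience timings (DNS time, connection time, first‑packet time) can be broken out with the Python standard library. The host is a placeholder; HTTPS, redirects, and true browser first‑screen rendering are outside the scope of this simplified example.

```python
# Rough timing breakdown for a single HTTP request: DNS resolution,
# TCP connect, and time to first byte (first-packet time).
import socket
import time

HOST, PORT, PATH = "example.com", 80, "/"   # placeholder target

def breakdown(host: str = HOST, port: int = PORT, path: str = PATH):
    t0 = time.time()
    family, socktype, proto, _, sockaddr = socket.getaddrinfo(
        host, port, type=socket.SOCK_STREAM)[0]
    dns_ms = (time.time() - t0) * 1000            # DNS time

    t1 = time.time()
    sock = socket.socket(family, socktype, proto)
    sock.settimeout(10)
    sock.connect(sockaddr)
    connect_ms = (time.time() - t1) * 1000        # TCP connection time

    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    t2 = time.time()
    sock.sendall(request.encode())
    sock.recv(1)                                  # block until the first byte arrives
    first_packet_ms = (time.time() - t2) * 1000   # first-packet time (time to first byte)
    sock.close()

    return {"dns_ms": dns_ms, "connect_ms": connect_ms,
            "first_packet_ms": first_packet_ms}

if __name__ == "__main__":
    print(breakdown())
```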

Tags: user experience, cloud computing, performance evaluation, availability, usability
Written by

Efficient Ops

This public account is maintained by Xiaotianguo and friends and regularly publishes widely read original technical articles. We focus on operations transformation and aim to accompany you throughout your operations career as we grow together.
