
Introduction to Serverless Architecture and Cloud Functions (SCF)

This article introduces serverless architecture and Tencent Cloud Serverless Cloud Function (SCF). It explains how Function‑as‑a‑Service abstracts away servers through pay‑per‑use billing, automatic scaling, and trigger‑driven execution, and surveys typical scenarios such as API services, object processing, log analysis, message handling, and scheduled jobs.

Tencent Cloud Developer

Through this article you can learn: first, an introduction to Serverless architecture; second, an overview of cloud function products; third, typical Serverless use cases.

The content is based on Huang Wenjun’s sharing at the Cloud+ Community Salon titled “Serverless Architecture Development and Serverless Cloud Function (SCF) Deployment Practice”. Huang previously worked on enterprise storage and container platforms and now leads the SCF product at Tencent Cloud.


Before Serverless, computing evolved from physical servers (requiring hardware procurement, IDC networking, and manual operations) to virtual machines (IaaS) and then to container platforms (PaaS). With the rise of Function‑as‑a‑Service (FaaS), developers no longer need to manage underlying servers; they only focus on business logic.

Serverless architecture consists of two layers: Function‑as‑a‑Service (providing compute) and Backend‑as‑a‑Service (providing storage, databases, caching, etc.). Both layers abstract away the underlying infrastructure, allowing developers to use services directly without worrying about servers.

How Function‑as‑a‑Service works: developers upload code (cloud function) to the platform, configure triggers, and the function runs only when the trigger fires. The platform automatically creates instances to handle concurrent events and scales on demand. Billing is pay‑per‑use: you are charged only while the function is executing.
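A minimal sketch of what such an uploaded cloud function can look like, assuming a Python runtime with an entry point named `main_handler(event, context)` (the field names in the returned response are illustrative of an HTTP-style trigger payload, not a guaranteed schema):

```python
import json

def main_handler(event, context):
    # Entry point the platform invokes each time a trigger fires.
    # `event` carries the trigger payload; `context` carries runtime metadata.
    name = event.get("queryString", {}).get("name", "world")
    return {
        "isBase64Encoded": False,
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

After uploading code like this, the developer attaches a trigger (for example an API Gateway route) in the console; the platform then creates and scales instances automatically as requests arrive.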

Automatic concurrency means the platform launches multiple instances automatically based on event volume, unlike traditional containers or VMs that require manual scaling. This on‑demand model also reduces costs during low‑traffic periods.

Developers only need to write business code and configure triggers; the underlying infrastructure, runtime environment, and OS maintenance are handled by the platform. Supported runtimes include Python, Node.js, PHP, Go, and Java.

Supported triggers cover scheduled timers, COS (object storage) events, CMQ message queues, API Gateway, and CKafka. These enable a wide range of scenarios such as API services, image processing, file packaging, log archiving, message processing, and periodic tasks.
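Because each trigger type delivers a differently shaped event, a function that serves several triggers typically inspects the payload first. The sketch below is a hypothetical dispatcher; the field names (`Records`, `Type`, `httpMethod`) are illustrative assumptions about the respective event shapes, not a documented contract:

```python
def route_event(event):
    # Hypothetical dispatcher: inspect the event shape to decide
    # which kind of trigger fired. Field names are assumptions.
    if "Records" in event:            # object-storage (COS) style event
        return "cos"
    if event.get("Type") == "Timer":  # scheduled timer event
        return "timer"
    if "httpMethod" in event:         # API Gateway HTTP event
        return "api"
    return "unknown"
```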

Typical Serverless scenarios include:

API services: API Gateway forwards requests to cloud functions, which execute stateless business logic and interact with storage, databases, or caches.

Object file processing: functions react to COS upload/delete events for tasks like image resizing or format conversion.

File batch packaging: functions generate zip archives on demand after files are uploaded.

Log analysis: functions process logs stored in COS, extract metrics, and write results to databases.

Message handling: functions consume CMQ or CKafka messages, perform business logic, and optionally forward results.

Scheduled jobs: timer‑triggered functions perform health checks, backups, or periodic data aggregation.
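The log-analysis scenario above can be sketched as a small handler. This is a toy illustration: a real deployment would download the log object named in the COS event rather than use the inline sample, and `analyze_logs` is a hypothetical helper, not part of any SDK:

```python
from collections import Counter

def analyze_logs(lines):
    # Extract the HTTP status code (assumed to be the third field of
    # each access-log line) and tally request counts per status.
    counts = Counter()
    for line in lines:
        parts = line.split()
        if len(parts) >= 3 and parts[2].isdigit():
            counts[parts[2]] += 1
    return dict(counts)

def main_handler(event, context):
    # In a real deployment the function would fetch the log object
    # referenced by the COS event; here we use an inline sample.
    sample = [
        "GET /index.html 200",
        "GET /missing 404",
        "POST /api/orders 200",
    ]
    return analyze_logs(sample)
```

The extracted metrics would then be written to a database or forwarded downstream, as described above.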

In summary, Serverless allows developers to focus on business code, accelerating development and deployment. On‑demand execution and automatic scaling handle traffic spikes without manual capacity planning, while reducing operational overhead for both developers and ops teams.

Q&A highlights:

Resource limits can be set via support tickets to cap concurrency.

API services can be built either as a single function handling all routes or as multiple micro‑functions per endpoint.

Asynchronous triggers (CMQ, CKafka) do not provide immediate execution feedback; results can be communicated via downstream queues or callback APIs.

Heavy workloads like video transcoding should be offloaded to dedicated services, with functions acting as orchestration glue.

Tags: FaaS, serverless, cloud computing, event-driven, Cloud Functions, SCF
Written by Tencent Cloud Developer, the official Tencent Cloud community account that brings together developers and shares practical tech insights.