
Unlock Faster C++ Apps: Master Asynchronous Programming with std::async and Promise

This article explains the drawbacks of synchronous C++ code, introduces asynchronous programming concepts, and walks through core C++ async tools such as std::async, std::future, std::promise, and std::packaged_task with practical code examples and performance‑focused case studies.


When you save a document in an office application and the interface freezes while a progress bar crawls, the problem is caused by blocking synchronous code. Synchronous programming executes tasks one after another, so time‑consuming operations like file I/O or network requests stall the whole program and waste CPU cycles.

Asynchronous programming, by contrast, works like a multi‑lane highway: when a long‑running operation is encountered, the program can immediately switch to other work, keeping the CPU busy and dramatically improving responsiveness and user experience.

Part 1: Asynchronous Programming Introduction

1.1 What Is Asynchrony?

Asynchronous programming is a paradigm that lets a program continue executing other tasks while waiting for an operation to finish, instead of blocking; it is the opposite of synchronous (sequential) execution.

In traditional single‑threaded code, execution follows a strict order; asynchronous code breaks this order, allowing tasks to run independently and often more efficiently.

In simple terms, synchronous code runs step‑by‑step, while asynchronous code can run out of order, yielding higher performance.

Common asynchronous techniques include callback functions and asynchronous Ajax.

(1) Callback Functions

The most common example is setTimeout:

<script type="text/javascript">
	setTimeout(function() {
		console.log("First")
	}, 2000)
	console.log("Second")
</script>

If the code ran strictly top to bottom, "First" would print before "Second"; but setTimeout only schedules its callback for 2 seconds later, so "Second" prints immediately and "First" follows after the delay.

(2) Asynchronous Ajax

<button>Send an HTTP GET request and get the result</button>
<script>
	$(document).ready(function() {
		$("button").click(function() {
			$.get("data.json", function(data, status) {
				console.log("Data: " + data + "\nStatus: " + status);
			});
			console.log("1111")
		});
	});
</script>

Because $.get is asynchronous, "1111" is logged first; the data and status are printed only when the response arrives.

1.2 Synchronous vs Asynchronous: The Tortoise and the Hare

Synchronous programming is like a steady tortoise: each task must finish before the next begins, so any long‑running operation blocks the whole program.

Asynchronous programming is like a swift hare: when a time‑consuming task appears, the program hands it off and continues with other work, receiving the result later via callbacks, promises, or other mechanisms.

1.3 Why Asynchrony Matters in C++

C++ is widely used in system‑level development, game development, embedded systems, and high‑performance computing, where blocking operations can severely degrade performance.

In operating systems, asynchronous I/O lets the system handle other tasks while waiting for disk or network operations, improving responsiveness and throughput.

In games, asynchronous tasks keep rendering and input handling smooth, preventing frame drops.

In HPC, async techniques exploit multiple cores to run independent calculations in parallel, shortening overall execution time.

Part 2: Exploring C++ Asynchronous Tools

Now that the importance of asynchrony is clear, let’s examine the powerful C++ facilities for implementing it.

2.1 std::async: Your Asynchronous Workhorse

std::async launches a function asynchronously and returns a std::future to retrieve its result. Basic syntax:

std::future<ReturnType> future = std::async(launch_policy, function_name, arg1, arg2, ...);

The optional launch policy can be:

std::launch::async – run the task on a new thread immediately.

std::launch::deferred – defer execution until future.get() or future.wait() is called; the task then runs synchronously on the calling thread.

std::launch::async | std::launch::deferred – the default; the implementation decides.

Example:

#include <iostream>
#include <future>
#include <thread>   // std::this_thread::sleep_for
#include <chrono>

// Simulate a heavy task
int heavyTask() {
    std::this_thread::sleep_for(std::chrono::seconds(2));
    return 42;
}

int main() {
    std::future<int> futureResult = std::async(std::launch::async, heavyTask);
    std::cout << "Doing other things while waiting for the task..." << std::endl;
    int result = futureResult.get(); // blocks only if the task isn’t finished
    std::cout << "The result of the heavy task is: " << result << std::endl;
    return 0;
}

2.2 std::future: Accessing Asynchronous Results

std::future provides a window into the state of an asynchronous operation.

get() blocks until the result is ready and returns it (or re‑throws any exception stored by the task).

std::future<int> future = std::async([](){ return 1 + 2; });
int result = future.get();
std::cout << "Result: " << result << std::endl;

wait() blocks until the task finishes but discards the result.

std::future<void> future = std::async([](){ std::this_thread::sleep_for(std::chrono::seconds(3)); });
std::cout << "Waiting for the task..." << std::endl;
future.wait();
std::cout << "Task finished." << std::endl;

wait_for() blocks for a specified duration and returns a std::future_status indicating ready, timeout, or deferred.

std::future<int> future = std::async([](){ std::this_thread::sleep_for(std::chrono::seconds(2)); return 42; });
std::this_thread::sleep_for(std::chrono::seconds(1));
auto status = future.wait_for(std::chrono::seconds(1));
if (status == std::future_status::ready) {
    int result = future.get();
    std::cout << "Task completed, result: " << result << std::endl;
} else if (status == std::future_status::timeout) {
    std::cout << "Task is still in progress." << std::endl;
} else {
    std::cout << "Task is deferred." << std::endl;
}

wait_until() works like wait_for() but blocks until a specific time point rather than for a duration.

2.3 std::promise: Bridging Threads

std::promise pairs with std::future to transfer a value from one thread to another. The producing thread calls set_value() (or set_exception()), while the consuming thread retrieves it via future.get().

#include <iostream>
#include <thread>
#include <future>
#include <chrono>

void setPromiseValue(std::promise<int>& promise) {
    std::this_thread::sleep_for(std::chrono::seconds(2));
    promise.set_value(42);
}

int main() {
    std::promise<int> promise;
    std::future<int> future = promise.get_future();
    std::thread t(setPromiseValue, std::ref(promise));
    std::cout << "Waiting for the result..." << std::endl;
    int result = future.get();
    std::cout << "The result is: " << result << std::endl;
    t.join();
    return 0;
}

2.4 std::packaged_task: Encapsulating Callables

std::packaged_task wraps a callable object so it can be executed asynchronously, exposing a std::future for the result.

#include <iostream>
#include <future>
#include <thread>

int add(int a, int b) { return a + b; }

int main() {
    std::packaged_task<int(int,int)> task(add);
    std::future<int> futureResult = task.get_future();
    std::thread t(std::move(task), 3, 5);
    std::cout << "Doing other things while the task runs..." << std::endl;
    int result = futureResult.get();
    std::cout << "The result of the addition is: " << result << std::endl;
    t.join();
    return 0;
}

Part 3: C++ Asynchronous Programming Case Studies

3.1 Case 1: Asynchronous File Reading

Reading a large file synchronously blocks the thread. An asynchronous version uses std::async to read the file in a separate task, allowing the main thread to continue working.

#include <iostream>
#include <fstream>
#include <future>
#include <chrono>

// The read itself is sequential; std::async runs it on a separate thread.
std::string readFileAsync(const std::string& filename) {
    std::ifstream file(filename, std::ios::binary);
    if (!file) return {}; // return an empty string if the file cannot be opened
    file.seekg(0, std::ios::end);
    std::string content(static_cast<size_t>(file.tellg()), '\0');
    file.seekg(0, std::ios::beg);
    file.read(&content[0], static_cast<std::streamsize>(content.size()));
    return content;
}

int main() {
    auto start = std::chrono::high_resolution_clock::now();
    std::future<std::string> futureContent = std::async(std::launch::async, readFileAsync, "large_file.txt");
    std::cout << "Doing other things while reading the file..." << std::endl;
    std::string content = futureContent.get();
    auto end = std::chrono::high_resolution_clock::now();
    auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
    std::cout << "Asynchronous read time: " << duration << " ms" << std::endl;
    return 0;
}

Although I/O time dominates, the asynchronous approach keeps the UI responsive, preventing freezes during large file reads.

3.2 Case 2: Multi‑Threaded Data Processing

Processing 10,000 integers by squaring each element can be parallelized across multiple threads. In the version below, each thread works on a disjoint chunk of the output vector, so the writes never overlap and no mutex is needed; locking per element would serialize the threads and erase the speedup.

#include <iostream>
#include <vector>
#include <thread>
#include <chrono>

// Each thread squares its own disjoint [start, end) range, so no lock is
// required; taking a mutex per element would serialize the threads and
// erase the parallel speedup.
void squareArrayMultiThread(const std::vector<int>& input, std::vector<int>& output, int start, int end) {
    for (int i = start; i < end; ++i) {
        output[i] = input[i] * input[i];
    }
}

int main() {
    std::vector<int> input(10000), output(10000);
    for (int i = 0; i < 10000; ++i) input[i] = i + 1;
    auto start = std::chrono::high_resolution_clock::now();
    const int numThreads = 4;
    std::vector<std::thread> threads;
    int chunkSize = input.size() / numThreads;
    for (int i = 0; i < numThreads; ++i) {
        int startIdx = i * chunkSize;
        int endIdx = (i == numThreads - 1) ? input.size() : (i + 1) * chunkSize;
        threads.emplace_back(squareArrayMultiThread, std::ref(input), std::ref(output), startIdx, endIdx);
    }
    for (auto& t : threads) t.join();
    auto end = std::chrono::high_resolution_clock::now();
    auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
    std::cout << "Multi‑thread processing time: " << duration << " ms" << std::endl;
    return 0;
}

Tests show multi‑threaded processing can significantly reduce execution time for large data sets, though thread‑creation overhead may outweigh benefits for small workloads.

Tags: concurrency, C++, multithreading, asynchronous programming, std::async
Written by Deepin Linux

Research areas: Windows & Linux platforms, C/C++ backend development, embedded systems and Linux kernel, etc.
