Why Integrated Communication, Sensing, Computing, and AI Are the Key to 6G
This article reviews the evolution from 1G to 5G, explains the ITU’s 6G recommendations, and analyzes how deep integration of communication, sensing, computing, and artificial intelligence—through compute‑endogenous networks, AI‑native RAN, ISAC, and autonomous RAN—will shape the next generation of mobile networks.
Introduction
6G is positioned as the next evolutionary step of mobile communications, extending the capabilities of 5G by tightly integrating communication, sensing, computing, and artificial intelligence (AI). The article reviews the historical progression from 1G to 5G to illustrate how each generation added new services (voice, SMS, broadband, massive IoT) and set the stage for the more demanding 6G use cases.
Evolution of Mobile Generations
1G (1980s): Analog cellular voice.
2G (1990s): Digital voice, SMS, MMS.
3G (early 2000s): Mobile internet, email, streaming.
4G (late 2000s): Mobile broadband, low latency, high‑capacity video and gaming.
5G (2019): Gigabit‑plus speeds, ultra‑low latency, massive device connectivity, enabling IoT, smart factories, and autonomous vehicles.
These milestones demonstrate a steady increase in data rate, latency performance, and service diversity, motivating a more radical redesign for 6G.
ITU 6G Recommendations and Global Research Landscape
In September 2024, the 3GPP SA#105 plenary meeting approved the first 6G standardisation project (covering scenarios, requirements, and key capabilities). ITU‑R Recommendation M.2160 (2023) defines seven overarching goals for 6G – inclusivity, ubiquitous connectivity, sustainability, innovation, security, standardisation, and interoperability – and identifies nine application trends (immersive media, digital twins, AI‑driven services, etc.). Six representative 6G usage scenarios are highlighted:
Immersive communication
Ultra‑massive connectivity
Ultra‑reliable low‑latency communication
AI‑communication convergence
Sensing‑communication convergence
Ubiquitous connectivity
Major economies have launched dedicated 6G programmes (e.g., China’s IMT‑2030 promotion group, the US Next G Alliance, the EU’s Hexa‑X‑II, Korea’s 6G R&D plan, and Japan’s 6G strategy), confirming a worldwide push toward standardisation and prototype development.
Why Integrated Communication‑Sensing‑Computing‑Intelligence Is Essential
Current 5G extensions (MEC, the NWDAF, the RIC) treat new functions as add‑on modules, which leads to:
Low overall resource utilisation (CPUs, GPUs, ASICs, and FPGAs remain under‑used).
Higher energy consumption.
Inability to jointly optimise heterogeneous service‑level requirements.
A unified design that embeds communication, sensing, computing, and AI from the ground up is required to meet the diverse performance targets of future services.
Key Enabling Technologies
1. Compute‑Endogenous Networks
Compute‑endogenous networking virtualises the baseband unit (BBU) compute resources into three pools:
General‑purpose pool – for standard data‑plane processing.
Communication‑specific pool – for PHY/MAC functions.
AI‑specific pool – for inference and training workloads.
Idle compute across BBUs is aggregated into a shared resource pool. An intelligent scheduler predicts the “tidal effect” of BBU load using time‑series forecasting and allocates resources to non‑communication AI tasks when surplus capacity is detected. This approach improves utilisation, reduces latency for AI services, and enables on‑demand edge AI.
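As a minimal sketch of this tidal‑effect scheduling idea, the snippet below forecasts BBU load with simple exponential smoothing (a lightweight stand‑in for the time‑series model described here) and derives the surplus capacity that can be lent to AI tasks. All names and thresholds are illustrative, not part of any standard.

```python
# Sketch of tidal-effect scheduling: forecast BBU load and release
# surplus capacity to AI tasks. Illustrative only.

def forecast_load(history, alpha=0.5):
    """Exponentially smoothed one-step-ahead load forecast (0.0-1.0)."""
    estimate = history[0]
    for load in history[1:]:
        estimate = alpha * load + (1 - alpha) * estimate
    return estimate

def surplus_capacity(history, headroom=0.2):
    """Capacity lendable to AI tasks after reserving headroom
    above the forecast communication load."""
    predicted = forecast_load(history)
    return max(0.0, 1.0 - predicted - headroom)

# Daytime busy hours: essentially no surplus for AI workloads.
print(surplus_capacity([0.7, 0.8, 0.9]))
# Night-time trough: most of the BBU pool can serve AI tasks.
print(surplus_capacity([0.3, 0.2, 0.1]))
```

With a rising daytime load history the surplus collapses to zero, while a night‑time trough frees most of the pool for AI workloads, which is exactly the "tidal" pattern the scheduler exploits.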
2. AI‑Native RAN Design
AI‑Native RAN extends the traditional base‑station with built‑in AI services that support both model training and inference:
On‑demand compute and storage are provisioned for AI model training at the edge.
A data‑marketplace allows users to acquire curated datasets directly from the RAN.
AI‑as‑a‑Service (AIaaS) exposes trained models for low‑latency inference within the radio access network, enabling applications such as real‑time conversational agents or vision‑based control.
The architecture follows a cloud‑native paradigm (containers, micro‑services) and leverages the O‑RAN open interfaces (A1, O1, O2) to expose AI capabilities to external orchestration platforms.
3. Integrated Sensing and Communication (ISAC)
ISAC merges radar‑type environmental sensing with the communication waveform, delivering two major benefits:
Improved spectrum efficiency by re‑using the same carrier for both data transmission and sensing.
New service categories (e.g., simultaneous positioning, object detection, and data exchange).
Technical challenges include:
Co‑existence and interference management between sensing and communication functions.
Full‑duplex operation and self‑interference cancellation.
Design of joint waveforms (e.g., OFDM‑radar, multi‑subcarrier chirps) that satisfy both communication reliability and sensing resolution.
Hardware augmentation such as dedicated sensing RF chains or re‑configurable antenna arrays.
AI‑enhanced signal processing pipelines for tasks like object classification and activity recognition.
4. RAN Autonomy (O‑RAN‑Based Closed‑Loop Control)
RAN autonomy builds on the O‑RAN architecture, using open APIs to integrate AI decision modules in the non‑real‑time RIC (rApps) and near‑real‑time RIC (xApps). The closed‑loop workflow consists of:
Data collection from distributed radio units via the O1 interface.
Policy inference and optimisation performed by AI models, with policies delivered through the A1 interface.
Near‑real‑time actuation of radio parameters (e.g., scheduling, beamforming) via the E2 interface.
This enables dynamic resource orchestration, plug‑and‑play AI applications, and a reduction in OPEX through automated network optimisation.
Detailed Technical Description of Core Concepts
Compute‑Endogenous Architecture
The BBU virtualisation layer abstracts physical compute (CPU, GPU, ASIC) into logical pools. A resource‑state database tracks utilisation per pool and per time slot. The scheduler runs a two‑stage algorithm:
Prediction stage: Apply a recurrent neural network (e.g., LSTM) to forecast BBU load for the next scheduling window.
Allocation stage: Solve a mixed‑integer linear program that maximises the weighted sum of communication QoS and AI task throughput while respecting latency and power budgets.
Resulting allocations are pushed to the BBUs via the O‑RAN O2 interface, enabling seamless hand‑off between communication and AI workloads.
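The allocation stage can be sketched with a deliberately simplified stand‑in: instead of a full mixed‑integer program, a greedy heuristic packs candidate AI tasks into the forecast surplus by throughput gained per unit of compute. The task list and helper name are illustrative.

```python
# Greedy stand-in for the allocation stage's mixed-integer program:
# admit AI tasks into the forecast surplus by value density.

def allocate_ai_tasks(surplus, tasks):
    """tasks: list of (task_id, compute_demand, throughput_gain).
    Admit tasks in order of throughput per unit of compute."""
    admitted = []
    for task_id, demand, gain in sorted(
            tasks, key=lambda t: t[2] / t[1], reverse=True):
        if demand <= surplus:
            admitted.append(task_id)
            surplus -= demand
    return admitted

tasks = [("train-A", 0.4, 2.0), ("infer-B", 0.1, 1.5), ("train-C", 0.5, 1.0)]
print(allocate_ai_tasks(0.6, tasks))  # admits infer-B, then train-A
```

A production scheduler would add the latency and power constraints from the MILP formulation; the greedy version only captures the core trade‑off between compute demand and AI task throughput.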
AI‑Native RAN Service Flow
```python
# Example pseudo-code for AI model registration in an AI-Native RAN
register_model(model_id, model_blob, required_compute="GPU", data_source="marketplace")
allocate_resources(model_id)
while True:
    data = fetch_input()
    result = infer(model_id, data)
    send_result_to_application(result)
```

This flow illustrates how a model is registered, resources are allocated from the compute‑endogenous pool, and inference is performed locally at the edge.
ISAC Waveform Co‑Design
A practical ISAC implementation uses an OFDM carrier with a subset of sub‑carriers dedicated to chirp‑based ranging. The transmitter emits a composite signal:
```python
# Transmitter composes the joint waveform
tx_signal = ofdm_data + chirp_preamble
# Receiver separates data and sensing streams
rx_data = ofdm_demodulate(tx_signal)
rx_sensing = chirp_correlate(tx_signal)
```

AI‑driven post‑processing (e.g., convolutional neural networks) refines the raw range‑Doppler map to detect objects with sub‑meter accuracy.
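The chirp‑correlation step can be illustrated as a minimal matched filter: cross‑correlating the received samples with the known preamble recovers the round‑trip delay in samples, from which one‑way range follows. This is a pure‑Python sketch on real‑valued samples; an actual OFDM‑radar receiver operates on complex baseband signals.

```python
# Matched-filter ranging sketch: find the lag where the received
# samples best match the known chirp preamble, then convert the
# round-trip delay to a one-way range.

def chirp_correlate(rx, preamble):
    """Return the lag (in samples) that maximises the correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(rx) - len(preamble) + 1):
        score = sum(rx[lag + i] * p for i, p in enumerate(preamble))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def lag_to_range_m(lag, sample_rate_hz):
    """Round-trip delay of `lag` samples -> one-way range in metres."""
    c = 3e8  # speed of light, m/s
    return (lag / sample_rate_hz) * c / 2

preamble = [1.0, -1.0, 1.0, 1.0, -1.0]
rx = [0.0] * 7 + preamble + [0.0] * 3  # echo delayed by 7 samples
lag = chirp_correlate(rx, preamble)
print(lag, lag_to_range_m(lag, sample_rate_hz=100e6))
```

At a 100 MHz sample rate, a 7‑sample round‑trip delay corresponds to a 10.5 m target, which shows why sensing resolution is tied directly to the sampling bandwidth of the joint waveform.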
RAN Autonomy Control Loop
```python
# Simplified closed-loop control pseudo-code
while True:
    metrics = collect_radio_metrics()      # collected via the O1 interface
    policy = ai_policy_inference(metrics)  # policies via the A1 interface
    apply_policy(policy)                   # actuation via the E2 interface
    sleep(control_interval)
```

The loop runs at sub‑second intervals, allowing the network to adapt to traffic spikes, interference, or sensing‑driven events.
Outlook
By 2030, 6G networks are expected to embed AI/ML at every layer, manage massive data streams through coordinated edge‑cloud resources, and provide native sensing capabilities. The convergence of compute‑endogenous architectures, AI‑native RAN, ISAC, and autonomous RAN will deliver the performance, flexibility, and sustainability required for immersive media, digital twins, autonomous systems, and other emerging 6G services.
AsiaInfo Technology: New Tech Exploration
AsiaInfo's cutting‑edge ICT viewpoints and industry insights, featuring its latest technology and product case studies.
