
How Information Entropy Powers AI‑Driven Alert Noise Reduction in Cloud‑Native Operations

This article explains how Shannon's information entropy and NLP are combined in Alibaba Cloud's ARMS intelligent noise reduction to quantify alert uncertainty, filter redundant notifications, and automatically prioritize critical incidents, offering a practical, self‑learning solution for modern monitoring environments.

Alibaba Cloud Native

Information Entropy Background

Claude Shannon (1948) defined information entropy as a quantitative measure of a source's uncertainty, analogous to thermodynamic entropy. For a source emitting events with probabilities pᵢ, the entropy is H = −Σ pᵢ·log₂(pᵢ); a single event with probability p contributes −p·log₂(p) to that sum. In operations, an alert can be treated as an information item: the more predictable the alert, the less information it carries and the lower its entropy contribution.
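As a minimal sketch, the per-event entropy term above can be computed directly in pure Python (no ARMS internals assumed):

```python
import math

def event_entropy(p: float) -> float:
    """Entropy contribution -p*log2(p) of an event with probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0  # a certain (or impossible) event carries no uncertainty
    return -p * math.log2(p)

# A near-certain, highly predictable alert contributes almost nothing,
# while a less predictable one contributes more:
print(event_entropy(0.99))  # ~0.014
print(event_entropy(0.5))   # 0.5
```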

Alert Noise Problem in Modern Monitoring

Large‑scale monitoring platforms generate millions of alerts daily. Multiple tools, overlapping thresholds, and custom rules produce duplicate or low‑value alerts, creating alert storms that hide critical incidents.

ARMS Intelligent Noise Reduction

The ARMS ITSM service combines natural‑language processing (NLP) with Shannon entropy to assign an entropy score to each incoming alert. Alerts with low entropy are classified as noise; alerts with high entropy are highlighted for rapid response.

Model Construction

Text Vectorization – Tokenize the alert title and body using a domain‑specific vocabulary, then convert tokens into word vectors (e.g., TF‑IDF or embedding vectors).

Entropy‑Weighted Importance – For each token compute its probability p(token) from the historical corpus, calculate its entropy H(token) = -p(token)·log₂(p(token)), and weight it by TF‑IDF. Aggregate token entropies to obtain an event‑level entropy.
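The two steps above can be sketched together, hedged heavily: whitespace tokenization stands in for the domain-specific vocabulary, a simple add-one smoothing handles unseen tokens, and the TF-IDF variant is a common textbook formula, not the one ARMS necessarily uses.

```python
import math
from collections import Counter

def alert_entropy(alert: str, corpus: list[str]) -> float:
    """Event-level entropy: TF-IDF-weighted sum of per-token -p*log2(p).

    `corpus` is the historical alert corpus used to estimate p(token).
    """
    corpus_toks = [doc.lower().split() for doc in corpus]
    counts = Counter(tok for doc in corpus_toks for tok in doc)
    total = sum(counts.values())

    toks = alert.lower().split()
    if not toks:
        return 0.0
    score = 0.0
    for tok in set(toks):
        p = counts.get(tok, 1) / (total + 1)   # smoothed corpus probability
        h = -p * math.log2(p)                  # per-token entropy term
        tf = toks.count(tok) / len(toks)
        df = sum(1 for doc in corpus_toks if tok in doc)
        idf = math.log((1 + len(corpus_toks)) / (1 + df)) + 1
        score += tf * idf * h                  # entropy weighted by TF-IDF
    return score
```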

Normalization – Apply a sigmoid function S(x) = 1 / (1 + e^{-x}) to map raw entropy to a 0‑1 score, simplifying threshold comparison.
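The normalization step is a one-liner; the logistic sigmoid maps any raw entropy onto (0, 1) so a single threshold can be compared across alerts:

```python
import math

def normalize(raw_entropy: float) -> float:
    """Map a raw entropy score onto (0, 1) with the logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-raw_entropy))
```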

Iterative Training – Use historical handling records (e.g., escalation decisions) as labels to train a binary classifier that predicts “important” vs. “noise”. Retrain weekly on the latest month of data to keep the model up‑to‑date.

Key Parameters

Noise‑Event Threshold – Entropy score below which an alert is treated as noise.

Priority Keywords – User‑defined terms (e.g., "critical", "urgent") that increase an alert’s score.

Blacklist Keywords – Terms (e.g., "test", "demo") that force the alert’s entropy to zero.
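The three parameters could compose into a scoring rule along these lines; the fixed 0.2 boost and the exact rule ordering are illustrative assumptions, not ARMS internals:

```python
def apply_keyword_rules(score: float, text: str,
                        priority_words: list[str],
                        blacklist_words: list[str],
                        boost: float = 0.2) -> float:
    """Adjust a normalized entropy score with user-configured keyword lists.

    Blacklist words force the score to zero; priority words add a fixed
    boost, capped at 1.0 (boost size is an assumption for illustration).
    """
    toks = set(text.lower().split())
    if toks & {w.lower() for w in blacklist_words}:
        return 0.0
    if toks & {w.lower() for w in priority_words}:
        score = min(1.0, score + boost)
    return score

def is_noise(score: float, threshold: float = 0.5) -> bool:
    """Noise-event threshold: below it, an alert is treated as noise."""
    return score < threshold
```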

Business Value

Duplicate‑Event Suppression – Repeated alerts receive progressively lower entropy, eventually approaching zero, allowing automatic suppression.

Novel‑Event Discovery – Rare or previously unseen alerts retain high entropy, drawing operator attention.

Customizable Prioritization – Operators can adjust priority/blacklist keywords and the entropy threshold to match operational policies.

Self‑Learning Growth – Weekly retraining adapts the model to evolving alert patterns without manual intervention, even for users with limited historical data.

Best‑Practice Workflow

Open the ARMS console and navigate to the “Intelligent Noise Reduction” feature.

Enable the feature when alert volume exceeds a manageable level.

The system extracts up to one month of historical alerts (sampling if necessary) and trains the entropy model.

After training, view the detail page to monitor per‑alert entropy scores.

Configure priority and blacklist keywords, and set the noise‑event threshold to separate noise from actionable alerts.

Glossary

Noise‑Event – Alert whose entropy is below the configured threshold.

Non‑Noise Event – Alert whose entropy is equal to or above the threshold.

Priority Word – Keyword that raises an alert’s score.

Blacklist Word – Keyword that forces an alert’s entropy to zero.

Top‑50 Common Words – Most frequent tokens extracted from the historical corpus, displayed for reference.
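A Top-50 list of this kind can be reproduced from a historical corpus with a plain frequency count (a sketch, again assuming whitespace tokenization):

```python
from collections import Counter

def top_common_words(corpus: list[str], n: int = 50) -> list[tuple[str, int]]:
    """Most frequent tokens in the historical corpus, for display."""
    counts = Counter(tok for doc in corpus for tok in doc.lower().split())
    return counts.most_common(n)
```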

Illustrative UI Screenshots

Model architecture diagram
Workflow step 0 – entry
Enable feature
Training progress
Keyword configuration
Tags: Monitoring · NLP · Information Entropy · Alert Noise Reduction
Written by

Alibaba Cloud Native

We publish cloud-native tech news, curate in-depth content, host regular events and live streams, and share Alibaba product and user case studies. Join us to explore and share the cloud-native insights you need.
