Why Do MySQL and Java Processes Hit 900% CPU? Proven Diagnosis & Fixes
Learn how to identify and resolve extreme CPU usage spikes—up to 900%—in MySQL and Java applications by using top, show processlist, indexing, caching, thread analysis, and code adjustments, with real‑world examples and step‑by‑step command guides.
Overview
On a multi-core server a single process can report CPU usage above 100%, and readings of 200% or more are a common production issue. This article explains why MySQL and Java processes can reach 900% CPU and provides systematic troubleshooting and optimization methods.
Scenario 1: MySQL CPU Spike to 900%
Diagnosis steps:
Run <code>top</code> to confirm the high-CPU process is <code>mysqld</code>.
Execute <code>show processlist</code> to locate the sessions consuming resources.
Identify heavy SQL statements and examine their execution plans for missing indexes or full table scans.
Remediation:
Kill the offending threads and observe CPU reduction.
Add missing indexes, rewrite inefficient SQL, or adjust memory parameters.
Limit excessive connections and use caching (e.g., Redis) to reduce query frequency.
Iteratively apply optimizations and monitor the impact.
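The caching step above follows the cache-aside pattern: check the cache first and hit the database only on a miss. A minimal sketch in Java, using a <code>ConcurrentHashMap</code> as a stand-in for Redis to stay self-contained (the <code>UserCache</code> class and <code>loadFromDb</code> helper are illustrative, not from the original article):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Cache-aside lookup: repeated reads of the same key never touch the database.
public class UserCache {
    private final Map<String, Long> cache = new ConcurrentHashMap<>();
    int dbHits = 0; // counts real database queries, for demonstration

    public long idByUserCode(String userCode) {
        // computeIfAbsent calls the loader only on a cache miss
        return cache.computeIfAbsent(userCode, this::loadFromDb);
    }

    // Stand-in for: select id from user where user_code = ?
    private long loadFromDb(String userCode) {
        dbHits++;
        return (long) userCode.hashCode();
    }

    public static void main(String[] args) {
        UserCache c = new UserCache();
        long first = c.idByUserCode("u-001");
        long second = c.idByUserCode("u-001"); // served from cache
        System.out.println(first == second);
        System.out.println(c.dbHits); // the database was queried only once
    }
}
```

With a real Redis client the map lookup becomes a <code>GET</code> and the miss path a <code>SET</code> with a TTL, but the control flow is the same.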
Real‑world MySQL case
A production MySQL instance showed CPU usage above 900% due to unindexed queries and an enabled slow-log that aggravated the load. Disabling the slow-log, adding the missing index on <code>user_code</code>, and moving frequent reads to a Redis cache brought CPU down to a stable 70-80%.
<code>show processlist;</code>
<code>select id from user where user_code = 'xxxxx';</code>
<code>show index from user;</code>
Key takeaways: avoid enabling the slow-log under high load, use <code>show processlist</code> to pinpoint problematic queries, and combine indexing with caching.
Scenario 2: Java CPU Spike to 900%
Diagnosis steps:
Use <code>top</code> to find the Java process PID.
Run <code>top -Hp <PID></code> to list threads and identify the thread with the highest CPU.
Convert the thread ID to hexadecimal and search the thread dump for it.
Execute <code>jstack</code> to obtain the stack trace and locate the offending code.
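The decimal-to-hex conversion matters because <code>top -Hp</code> shows decimal thread IDs while <code>jstack</code> reports them as hexadecimal <code>nid</code> values. The conversion can be checked in Java (equivalent to the shell's <code>printf "%x"</code>):

```java
public class ThreadIdHex {
    public static void main(String[] args) {
        // top -Hp prints decimal thread IDs; jstack dumps show them as
        // hex "nid=0x..." values, so convert before grepping the dump
        int tid = 30309;                       // example thread ID from top -Hp
        String nid = Integer.toHexString(tid); // "7665" — the value to grep for
        System.out.println(nid);
    }
}
```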
Remediation patterns:
If the thread is in an empty loop or spin-wait, insert <code>Thread.sleep</code> or use proper blocking/locking.
If excessive object creation triggers frequent GC, reduce allocations or use object pools.
For selector busy‑polling, rebuild the selector as shown in Netty source.
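The object-pool remedy from the list above can be sketched with a small <code>ArrayDeque</code>-backed pool that reuses byte buffers instead of allocating a fresh one per request (the <code>BufferPool</code> class and its sizes are illustrative assumptions, not from the original article):

```java
import java.util.ArrayDeque;

// Minimal object pool: reusing byte[] buffers lowers the allocation rate
// and therefore the GC pressure that was driving CPU usage up.
public class BufferPool {
    private final ArrayDeque<byte[]> free = new ArrayDeque<>();
    private final int bufferSize;
    int allocations = 0; // counts real allocations, for demonstration

    public BufferPool(int bufferSize) {
        this.bufferSize = bufferSize;
    }

    public synchronized byte[] acquire() {
        byte[] buf = free.poll();
        if (buf == null) {
            allocations++;            // allocate only when the pool is empty
            buf = new byte[bufferSize];
        }
        return buf;
    }

    public synchronized void release(byte[] buf) {
        free.push(buf);               // return the buffer for reuse
    }

    public static void main(String[] args) {
        BufferPool pool = new BufferPool(4096);
        for (int i = 0; i < 1000; i++) {
            byte[] buf = pool.acquire();
            pool.release(buf);
        }
        System.out.println(pool.allocations); // one allocation served 1000 uses
    }
}
```

Production pools (e.g., Netty's <code>ByteBufAllocator</code>) add sizing limits and thread-local caches, but the reuse principle is the same.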
Java CPU 700% Optimization Example
A Java service consumed over 700% CPU. The following commands were used to diagnose and fix the issue:
<code>top</code>
<code>top -Hp 23602</code>
<code>printf "%x\n" 30309</code>
<code>jstack -l 29706 > ./jstack_result.txt</code>
<code>cat jstack_result.txt | grep -A 100 7665</code>
The stack trace pointed to <code>ImageConverter.run()</code>. The original loop used <code>poll()</code> on a <code>BlockingQueue</code>, causing a tight empty loop when the queue was empty. Replacing it with <code>take()</code> blocks until data arrives, eliminating the CPU burn.
<code>while (isRunning) {
    try {
        // take() blocks until data is available, so the loop no
        // longer spins when the queue is empty
        byte[] buffer = device.getMinicap().dataQueue.take();
        // process buffer
    } catch (InterruptedException e) {
        // restore the interrupt flag and exit instead of swallowing it
        Thread.currentThread().interrupt();
        break;
    }
}</code>
After the change, CPU usage fell below 10% and the service stabilized.
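The difference is easy to see in isolation: on an empty queue <code>poll()</code> returns <code>null</code> immediately (which is what fed the busy loop), while <code>poll</code> with a timeout, or <code>take()</code>, waits for data. A small self-contained demonstration:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class PollVsTake {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();

        // poll() returns null immediately on an empty queue; inside a
        // while loop this spins at full speed and burns a core
        System.out.println(queue.poll()); // null

        // poll(timeout) parks the thread up to the timeout before giving up
        System.out.println(queue.poll(10, TimeUnit.MILLISECONDS)); // null

        // take() blocks until an element arrives; safe here only because
        // we put one in first
        queue.put("frame");
        System.out.println(queue.take()); // frame
    }
}
```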
Summary
High CPU usage in MySQL and Java processes can be mitigated by proper monitoring, identifying resource‑heavy queries or threads, adding missing indexes, leveraging caching layers, and fixing inefficient code patterns such as empty loops or excessive object creation.
Efficient Ops
This public account is maintained by Xiaotianguo and friends and regularly publishes original technical articles. We focus on operations transformation and aim to accompany you throughout your operations career.