How We Resolved Deep Pagination Slowness in MySQL: A Step‑by‑Step Case Study
This article walks through a real‑world MySQL deep‑pagination incident, detailing the timeline, root‑cause analysis, and multiple optimization techniques—including ID‑based queries, sub‑queries, and cursor pagination—that ultimately stabilized the system and eliminated slow‑SQL overload.
Incident Background
This incident is a typical pagination slow-query problem our team ran into. The timeline below reconstructs the scene and walks through each adjustment we made along the way.
Problem Scene
16:00 A colleague reported that availability of the fusion system's paginated query had dropped.
16:05 Checked the UMP monitoring for the query interface and found TP99 latency had spiked abnormally.
Per-machine monitoring showed TP999 was unusually high on almost every machine, while application CPU usage stayed low.
Further monitoring revealed that database CPU usage had surged, and a flood of slow-SQL alert emails arrived.
We judged the issue could not be fixed within minutes, so we took two steps: first, enable rate limiting so no more slow-SQL requests reached the database; second, analyze and rewrite the slow SQL.
16:15 Log inspection showed most of the slow SQL pointed to merchant xxxx, which accounted for roughly 100k rows (10% of the table). MQ monitoring showed heavy retries: the pagination interface's 2 s timeout meant timed-out calls were retried at high frequency, further dragging down the database.
16:25 Code review identified a deep‑pagination problem and proposed an optimization: first query IDs, then use a range query instead of offset‑based LIMIT.
17:40 After the optimized code was deployed, MQ consumption resumed, but the accumulated backlog still pressured the database: TP99 spiked again and database CPU hit 100%.
18:00 We decided to stop patching the old interface and instead build a new one based on scrolling IDs (cursor pagination), which required coordination with upstream callers.
22:20 The new interface went live and MQ consumption was reopened; despite the upstream backlog, the new interface performed steadily and the incident was resolved.
Root Cause and Solutions
Cause of Deep Pagination
Problem SQL example:
```sql
select * from table where org_code = xxxx limit 1000,100;
```

MySQL executes LIMIT 1000,100 by reading the first 1000 matching rows, discarding them, and then returning the next 100. The larger the offset, the more rows are scanned and thrown away, so deeper pages get steadily slower.
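To see the scan-and-discard cost model in runnable form, here is a small sketch using Python's sqlite3 module. SQLite stands in for MySQL (its OFFSET has the same discard semantics), and the table name, column names, and row counts are all illustrative:

```python
import sqlite3

# In-memory table standing in for the real one; names are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, org_code TEXT)")
conn.executemany("INSERT INTO t (org_code) VALUES (?)", [("xxxx",)] * 2000)

# LIMIT 100 OFFSET 1000: the engine walks past 1000 matching rows before
# returning rows 1001..1100 -- the deeper the page, the more wasted work.
rows = conn.execute(
    "SELECT id FROM t WHERE org_code = ? ORDER BY id LIMIT 100 OFFSET 1000",
    ("xxxx",),
).fetchall()
print(rows[0][0], rows[-1][0], len(rows))  # 1001 1100 100
```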
Various Solutions for Deep Pagination
Query ID + ID‑based Query
First query only the IDs, then use them in a range query. Example:

```sql
select id from table where org_code = xxxx limit 1000,5;
```

Then fetch the full rows by ID:

```sql
select * from table where id in (1,2,3,4,5);
```

ID-based Query Optimization
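As a runnable sanity check of this two-step flow, here is a minimal sketch; SQLite stands in for MySQL, and the table, column names, and row counts are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SQLite in place of MySQL
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, org_code TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO t (org_code, payload) VALUES (?, ?)",
    [("xxxx", f"row-{i}") for i in range(2000)],
)

# Step 1: page through the narrow ID column only.
ids = [r[0] for r in conn.execute(
    "SELECT id FROM t WHERE org_code = ? ORDER BY id LIMIT 5 OFFSET 1000",
    ("xxxx",))]

# Step 2: fetch the wide rows by primary key. A range works when the
# returned IDs are contiguous; otherwise IN (...) is the safe fallback.
rows = conn.execute(
    "SELECT id, payload FROM t WHERE id >= ? AND id <= ?",
    (min(ids), max(ids)),
).fetchall()
```

When the IDs happen to be contiguous, the range form in step 2 is equivalent to the IN list shown above.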
After obtaining IDs, use a range condition instead of IN for higher efficiency:
```sql
select * from table where org_code = xxxx and id >= 1 and id <= 5;
```

Using sub-queries can also reduce I/O:

```sql
select a.id, a.dj_sku_id, a.jd_sku_id
from table a
join (select id from jd_spu_sku where org_code = xxxx limit 1000,5) b on a.id = b.id;
```

Using Cursor-based Pagination
Each response carries the maximum ID of the page (the cursor). The next request supplies this cursor, and the server returns the next N rows whose ID is greater than it.
```sql
select * from table where org_code = xxxx and id > 0 order by id limit 10;
```

(An explicit order by id ensures the cursor advances deterministically.) After testing, this method performed well and was adopted.
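The adopted scheme can be sketched as a small loop, again with SQLite standing in for MySQL and a hypothetical table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, org_code TEXT)")
conn.executemany("INSERT INTO t (org_code) VALUES (?)", [("xxxx",)] * 95)

def fetch_page(cursor_id, page_size=10):
    """Return rows strictly after cursor_id, ordered so the cursor advances."""
    return conn.execute(
        "SELECT id FROM t WHERE org_code = ? AND id > ? ORDER BY id LIMIT ?",
        ("xxxx", cursor_id, page_size),
    ).fetchall()

# Drain the table page by page. Each request's cost is bounded by page_size
# no matter how deep we are -- unlike OFFSET, nothing is scanned and discarded.
cursor, pages, total = 0, 0, 0
while True:
    page = fetch_page(cursor)
    if not page:
        break
    cursor = page[-1][0]   # max ID of this page becomes the next cursor
    pages += 1
    total += len(page)
print(pages, total)  # 10 95
```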
Final Solution
Join‑based Approach
We first query only the IDs (selecting no non-index columns, so the sub-query is satisfied entirely by the secondary index), then join back to fetch the full rows:
```sql
select a.id AS id, a.dj_org_code AS djOrgCode, a.dj_sku_id AS djSkuId, a.jd_sku_id AS jdSkuId, a.yn AS yn
from table a
join (
    select id from table where org_code = xxxx and id > 0 order by id asc limit 500
) t on a.id = t.id;
```

EXPLAIN confirmed the intended index was used, and after deployment the issue was resolved.
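A runnable sketch of this join-based shape, with SQLite in place of MySQL and illustrative table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, org_code TEXT, sku TEXT)")
conn.executemany(
    "INSERT INTO t (org_code, sku) VALUES (?, ?)",
    [("xxxx", f"sku-{i}") for i in range(1000)],
)

# The derived table touches only the id column (covered by the secondary
# index in MySQL); the outer join then performs at most 500 primary-key
# lookups to fetch the wide rows.
rows = conn.execute(
    """
    SELECT a.id, a.sku
    FROM t a
    JOIN (SELECT id FROM t WHERE org_code = ? AND id > ?
          ORDER BY id ASC LIMIT 500) s ON a.id = s.id
    """,
    ("xxxx", 0),
).fetchall()
```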
Summary of Issues
B‑side systems must protect themselves with rate limiting to prevent abnormal traffic from crashing the service.
Potential risk points must not be ignored: the pagination interface had passed earlier performance checks but failed once data volumes grew.
SQL optimizations require thorough testing across edge cases before deployment.
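The first takeaway, self-protective rate limiting, can be illustrated with a minimal token bucket. This is a generic sketch, not the limiter the team actually used:

```python
import time

class TokenBucket:
    """Minimal token bucket: allow() returns False once the burst is spent."""
    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec        # refill rate, tokens per second
        self.capacity = burst           # maximum burst size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# A burst of 10 quick calls against a bucket allowing 5: the first 5 pass,
# the rest are shed instead of piling onto the database.
bucket = TokenBucket(rate_per_sec=1, burst=5)
results = [bucket.allow() for _ in range(10)]
```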
