Generating Hourly Slow Query Reports from MySQL Slow Log Using pt-query-digest and termsql
This guide shows how to generate hourly slow‑query reports from MySQL slow‑log files by using pt‑query‑digest’s timeline feature, filtering with sed, importing the data into SQLite with termsql, and then querying the results to identify hotspot periods and recurring patterns.
Problem: When only a MySQL slow‑log file is available, obtaining an hourly report of slow‑query counts can be cumbersome.
Experiment: Use pt-query-digest --timeline to produce a timeline‑enabled slow‑query report that includes timestamps.
Filter the timeline output with sed to extract the relevant entries.
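The exact sed pattern depends on the timeline format your version of pt‑query‑digest emits. As a rough sketch, assuming timeline entries are comment lines beginning with "# " plus a timestamp (the sample below is invented for illustration, not real tool output), the filtering step can be expressed in a few lines of Python:

```python
import re

# Invented sample resembling pt-query-digest --timeline output;
# the real format may vary between Percona Toolkit versions (assumption).
sample = """\
# ########################################################
# timeline report
# ########################################################
# 2024-05-01 10:03:12 00:00 1 SELECT orders
# 2024-05-01 10:47:55 00:01 2 UPDATE users
# 2024-05-01 11:02:40 00:00 1 SELECT orders
"""

# Keep only lines that start with "# " followed by a timestamp,
# mirroring a sed filter along the lines of: sed -n '/^# [0-9]\{4\}-/p'
ts_line = re.compile(r"^# \d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")
entries = [line for line in sample.splitlines() if ts_line.match(line)]

for line in entries:
    print(line)
```

The three banner lines are dropped and only the timestamped entries survive, which is exactly the shape we want to feed into SQLite next.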
Install termsql, a lightweight tool that imports plain‑text data into a SQLite database.
Import the filtered timeline report into SQLite via termsql, which stores each line as a record.
Now you can run arbitrary SQL queries against the SQLite database to obtain per‑hour statistics. For example, a query can count slow queries per hour and reveal peak periods.
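termsql performs this import for you; to make the idea concrete without it, here is a minimal sketch using Python's built-in sqlite3 module. The five-column split of each timeline line (date, time, duration, count, statement) and the sample rows are assumptions for illustration:

```python
import sqlite3

# Filtered timeline entries (invented; format assumed to be
# "# <date> <time> <duration> <count> <statement>").
rows = [
    "# 2024-05-01 10:03:12 00:00 1 SELECT orders",
    "# 2024-05-01 10:47:55 00:01 2 UPDATE users",
    "# 2024-05-01 11:02:40 00:00 1 SELECT orders",
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE timeline (day TEXT, tm TEXT, dur TEXT, cnt INTEGER, item TEXT)"
)
for line in rows:
    # Split into: "#", date, time, duration, count, rest-of-line
    parts = line.split(None, 5)
    conn.execute(
        "INSERT INTO timeline VALUES (?, ?, ?, ?, ?)",
        (parts[1], parts[2], parts[3], int(parts[4]), parts[5]),
    )

# Count slow-query events per hour to reveal peak periods.
per_hour = conn.execute(
    "SELECT day || ' ' || substr(tm, 1, 2) AS hour, SUM(cnt) "
    "FROM timeline GROUP BY hour ORDER BY hour"
).fetchall()
print(per_hour)  # -> [('2024-05-01 10', 3), ('2024-05-01 11', 1)]
```

With termsql the same GROUP-BY-hour query runs directly against the table it builds from the piped-in text, so no custom parsing code is needed.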
Using a more complex slow‑log file demonstrates that the method easily highlights hot‑spot time windows and helps discover periodic business patterns.
Benefits of termsql:
1. Quickly exports text data to SQLite for convenient SQL querying.
2. Can be combined with tools such as ps or top; for example, to find processes whose CPU usage exceeds 25%.
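As a sketch of the second point, the ps-into-SQL idea can be emulated with Python's sqlite3 in place of termsql itself; the process rows and the (USER, PID, %CPU, COMMAND) column layout below are assumptions for illustration:

```python
import sqlite3

# Invented sample rows in a ps-aux-like column order;
# in practice you would pipe real `ps aux` output into termsql instead.
procs = [
    ("mysql", 1201, 87.5, "mysqld"),
    ("root", 414, 2.1, "sshd"),
    ("app", 930, 31.0, "java"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ps (user TEXT, pid INTEGER, cpu REAL, cmd TEXT)")
conn.executemany("INSERT INTO ps VALUES (?, ?, ?, ?)", procs)

# Processes using more than 25% CPU, as in the termsql example.
hogs = conn.execute(
    "SELECT user, pid, cpu, cmd FROM ps WHERE cpu > 25 ORDER BY cpu DESC"
).fetchall()
print(hogs)  # -> [('mysql', 1201, 87.5, 'mysqld'), ('app', 930, 31.0, 'java')]
```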
Overall, this approach provides a simple, scriptable way to turn raw MySQL slow‑log files into actionable hourly reports without needing a full‑blown monitoring system.
Aikesheng Open Source Community
The Aikesheng Open Source Community provides stable, enterprise‑grade MySQL open‑source tools and services, releases a premium open‑source component each year on "1024" (October 24, Programmer's Day in China), and continuously operates and maintains them.