Flutter UI Gesture Recording and Replay Technique
The article presents a full system for capturing and replaying Flutter UI gestures, detailing how raw pointer data is processed, how gesture recognizers compete, how recordings wrap callbacks and generate synthetic touch packets, and how a timer‑driven replay reproduces user interactions for debugging.
After an app is released, developers often struggle to reproduce and locate user‑side issues. The article proposes a complete system for capturing and replaying Flutter UI gestures to solve this problem.
It first explains the fundamentals of Flutter's gesture handling: raw pointer data (PointerDataPacket) arrives from the native side, coordinates are converted from physical to logical pixels, and the event is dispatched to the appropriate RenderObject via hit testing. Multiple gesture recognizers then compete in a "gesture arena": each recognizer can declare victory or defeat, and the arena resolves which one ultimately handles the event sequence.
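To make the pipeline concrete, here is a minimal sketch (not the article's code) of tapping into the raw pointer stream before the framework dispatches it; wrapping `onPointerDataPacket` and dividing by the device pixel ratio mirror what Flutter's GestureBinding does internally:

```dart
import 'dart:ui' as ui;

/// Sketch: intercept raw pointer packets, log them, then hand them back
/// to the framework so hit testing proceeds unchanged.
void installPointerProbe() {
  final ui.PointerDataPacketCallback? original = ui.window.onPointerDataPacket;
  ui.window.onPointerDataPacket = (ui.PointerDataPacket packet) {
    for (final ui.PointerData p in packet.data) {
      // Engine coordinates are physical pixels; divide by the device
      // pixel ratio to get the logical pixels used by hit testing.
      final double dx = p.physicalX / ui.window.devicePixelRatio;
      final double dy = p.physicalY / ui.window.devicePixelRatio;
      print('${p.change} at logical ($dx, $dy)');
    }
    original?.call(packet); // let the framework run hit testing as usual
  };
}
```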
The recording process wraps the callback of each gesture recognizer, captures the view hierarchy (WidgetsFlutterBinding → A → C → K → G), the gesture type, and the touch coordinates. Example code for tap recording is shown:
```dart
static GestureTapCallback onTapWithRecord(GestureTapCallback orgOnTap, BuildContext context) { ... }
```

For scroll gestures, the article details how a series of synthetic touch points is generated. It calculates the required scroll offset, determines the number of touch events, and creates a list of PointerDataPacket objects in which the first point is a down event and the rest are move events. The generated data is then sent to the Flutter engine at a fixed frame interval using a Timer.
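A possible body for that wrapper is sketched below; `recordGesture` and `describePath` are illustrative helpers assumed for this sketch, not the article's API:

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/widgets.dart';

class GestureRecorder {
  /// Sketch: return a replacement callback that records the gesture,
  /// then runs the original so recording never alters app behavior.
  static GestureTapCallback onTapWithRecord(
      GestureTapCallback orgOnTap, BuildContext context) {
    return () {
      recordGesture(path: describePath(context), gesture: 'onTap');
      orgOnTap();
    };
  }

  /// Hypothetical helper: persist the recorded event (storage omitted).
  static void recordGesture({required String path, required String gesture}) {
    debugPrint('recorded $gesture at $path');
  }

  /// Hypothetical helper: walk up the element tree to build a path such as
  /// "WidgetsFlutterBinding → A → C → K → G".
  static String describePath(BuildContext context) {
    final List<String> names = [context.widget.runtimeType.toString()];
    context.visitAncestorElements((Element e) {
      names.add(e.widget.runtimeType.toString());
      return true; // keep walking up to the root
    });
    return names.reversed.join(' → ');
  }
}
```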
```dart
List<PointerDataPacket> createTouchDataList(int count, double unit, double physicalY, double physicalX) { ... }
```

During replay, a periodic timer checks the current scroll position, compares it with the target position, and sends the prepared touch packets via ui.window.onPointerDataPacket. When the target is reached, an up packet is sent and the timer is cancelled.
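Under the description above, the synthesis and the timer-driven replay loop might be sketched as follows; the parameter names of `replayScroll` and the 16 ms tick are assumptions, not the article's code:

```dart
import 'dart:async';
import 'dart:ui' as ui;

/// Sketch of synthetic touch generation: the first point is a down event,
/// the rest are move events stepped by `unit` physical pixels.
List<ui.PointerData> createTouchDataList(
    int count, double unit, double physicalY, double physicalX) {
  return List<ui.PointerData>.generate(count, (int i) {
    return ui.PointerData(
      change: i == 0 ? ui.PointerChange.down : ui.PointerChange.move,
      physicalX: physicalX,
      physicalY: physicalY - unit * i, // drag upward to scroll the list down
      timeStamp: Duration(milliseconds: 16 * i), // frame spacing (assumed)
    );
  });
}

/// Sketch of the replay loop: inject one point per tick, finish with an
/// up event and cancel the timer once the target offset is reached.
void replayScroll({
  required List<ui.PointerData> points,
  required double targetOffset,
  required double Function() currentOffset,
  required ui.PointerData upPoint,
}) {
  int index = 0;
  Timer.periodic(const Duration(milliseconds: 16), (Timer timer) {
    if (currentOffset() >= targetOffset || index >= points.length) {
      ui.window.onPointerDataPacket?.call(ui.PointerDataPacket(data: [upPoint]));
      timer.cancel();
      return;
    }
    ui.window.onPointerDataPacket
        ?.call(ui.PointerDataPacket(data: [points[index++]]));
  });
}
```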
The overall framework combines native and Flutter layers, linking UI events with recorded data, and provides a high‑level flow: capture → store → match view hierarchy → synthesize touch data → replay.
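For the "match view hierarchy" step of that flow, one plausible sketch (the recorded path format and the direct-child traversal are assumptions for illustration) is a walk down the element tree comparing widget type names:

```dart
import 'package:flutter/widgets.dart';

/// Illustrative sketch only: match a recorded path such as
/// ["A", "C", "K", "G"] against the live element tree at replay time.
Element? matchHierarchy(Element root, List<String> path) {
  Element? current = root;
  for (final String name in path) {
    Element? next;
    current!.visitChildren((Element child) {
      if (next == null && child.widget.runtimeType.toString() == name) {
        next = child;
      }
    });
    if (next == null) return null; // the UI no longer matches the recording
    current = next;
  }
  return current;
}
```

If the match fails, replay can be aborted rather than injecting touches into the wrong widget, which is one source of the recording/replay inconsistencies the conclusion mentions.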
In conclusion, the solution covers four parts: Flutter gesture fundamentals, UI gesture recording, UI gesture replay, and the complete system architecture. Future work includes making replayed touches more realistic (e.g., adding acceleration curves) and addressing inconsistencies between recording and replay.
Xianyu Technology
Official account of the Xianyu technology team