Deep Dive into Android Touch Event Processing: From Kernel to View
This article explains how Android transforms raw touch signals from the hardware into MotionEvent objects, routes them through the InputManagerService, InputReader, and InputDispatcher, and finally delivers them to the appropriate View or ViewGroup, detailing the roles of sockets, InputChannel, and the event‑dispatch chain.
Overview
This article explores the complete lifecycle of a touch event on Android, starting from the hardware level, passing through the Linux kernel, the InputManagerService, and ending with the View hierarchy that consumes the event.
1. From Hardware to Kernel
When a user touches the screen, the touch controller detects the voltage change, calculates the X/Y coordinates, and sends the data to the CPU (typically over an I²C bus). The kernel's input driver packages this raw data into input events and exposes them to user space through the /dev/input device files.
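Each record read from a /dev/input/eventX node is a fixed-size struct input_event. As a sketch of that wire format, the following Java snippet parses one such record from raw bytes; it assumes the common 64-bit layout (16-byte timestamp, then u16 type, u16 code, s32 value, little-endian), and the constants come from linux/input-event-codes.h:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class InputEventParser {
    // Linux input event constants (from linux/input-event-codes.h)
    static final int EV_ABS = 0x03;
    static final int ABS_MT_POSITION_X = 0x35;

    // Parses one struct input_event, assuming the 64-bit layout:
    // 16-byte timeval, u16 type, u16 code, s32 value = 24 bytes, little-endian.
    static int[] parse(byte[] raw) {
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        buf.getLong();                     // tv_sec  (skipped here)
        buf.getLong();                     // tv_usec (skipped here)
        int type  = buf.getShort() & 0xFFFF;
        int code  = buf.getShort() & 0xFFFF;
        int value = buf.getInt();
        return new int[] { type, code, value };
    }

    public static void main(String[] args) {
        // Synthetic record: EV_ABS / ABS_MT_POSITION_X = 540
        ByteBuffer buf = ByteBuffer.allocate(24).order(ByteOrder.LITTLE_ENDIAN);
        buf.putLong(0).putLong(0)
           .putShort((short) EV_ABS).putShort((short) ABS_MT_POSITION_X)
           .putInt(540);
        int[] ev = parse(buf.array());
        System.out.println("type=" + ev[0] + " code=" + ev[1] + " value=" + ev[2]);
    }
}
```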
2. Input Subsystem (InputReader & InputDispatcher)
The Android input subsystem consists of three layers:
EventHub : Monitors /dev/input, using inotify to detect input device nodes being added or removed, and epoll to wait for events becoming readable.
InputReader : Reads raw events from EventHub, parses them into KeyEvent or MotionEvent objects, and enqueues them.
InputDispatcher : Takes the parsed events, finds the target window, and forwards the events to the appropriate application.
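The three stages above can be condensed into a toy pipeline in plain Java. The queues, record types, and the "window0" target here are illustrative stand-ins, not AOSP classes: raw (type, code, value) tuples land in the EventHub queue, the reader step parses them into a motion record, and the dispatcher step delivers it to a target window:

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class InputPipelineSketch {
    record Raw(int type, int code, int value) {}
    record Motion(int x, int y) {}

    static final Queue<Raw> eventHub = new ArrayDeque<>();   // stage 1: raw events
    static final Queue<Motion> inbound = new ArrayDeque<>(); // stage 2: parsed events

    // InputReader step: consume a raw X/Y pair, emit one Motion record
    static void readerLoopOnce() {
        Raw x = eventHub.poll(), y = eventHub.poll();
        if (x != null && y != null) inbound.add(new Motion(x.value(), y.value()));
    }

    // InputDispatcher step: deliver the parsed event to the target window
    static String dispatchLoopOnce() {
        Motion m = inbound.poll();
        return m == null ? null : "window0 <- MotionEvent(" + m.x() + "," + m.y() + ")";
    }

    public static void main(String[] args) {
        eventHub.add(new Raw(3, 0x35, 540)); // EV_ABS / ABS_MT_POSITION_X
        eventHub.add(new Raw(3, 0x36, 960)); // EV_ABS / ABS_MT_POSITION_Y
        readerLoopOnce();
        System.out.println(dispatchLoopOnce());
    }
}
```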
Key native methods (simplified):
// EventHub constructor creates epoll and inotify fds
EventHub::EventHub() { mEpollFd = epoll_create(...); mINotifyFd = inotify_init(); }
// InputReader loop reads events and posts them
void InputReader::loopOnce() { size_t count = mEventHub->getEvents(timeoutMillis, mEventBuffer, EVENT_BUFFER_SIZE); if (count) processEventsLocked(mEventBuffer, count); }
// InputDispatcher notifies the application
void InputDispatcher::notifyMotion(const NotifyMotionArgs* args) {
MotionEntry* entry = new MotionEntry(args);
enqueueInboundEventLocked(entry);
}

3. InputManagerService Initialization and Startup
The SystemServer creates an InputManagerService instance, which in its constructor calls nativeInit() (JNI) to create a native NativeInputManager . This native object builds an EventHub , an InputReader , and an InputDispatcher , then starts their threads.
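As a rough model of that wiring (the class names mirror the article, but this is a sketch rather than the AOSP code), the native object constructs the components in dependency order and then launches the loop threads; in AOSP the dispatcher is typically running before the reader starts feeding it events:

```java
import java.util.ArrayList;
import java.util.List;

public class NativeInputManagerSketch {
    final List<String> startupLog = new ArrayList<>();

    class EventHub { EventHub() { startupLog.add("EventHub created"); } }
    class InputDispatcher { InputDispatcher() { startupLog.add("InputDispatcher created"); } }
    class InputReader {
        InputReader(EventHub hub, InputDispatcher dispatcher) {
            startupLog.add("InputReader created");
        }
    }

    // Construction order: the reader depends on both the hub and the dispatcher.
    final EventHub hub = new EventHub();
    final InputDispatcher dispatcher = new InputDispatcher();
    final InputReader reader = new InputReader(hub, dispatcher);

    // start() launches the two loop threads; the dispatcher loop is brought up
    // before the reader begins feeding it events. (join() here only keeps the
    // toy log ordering deterministic; the real loops run forever.)
    void start() {
        Thread dispatcherThread = new Thread(() -> startupLog.add("dispatcher loop running"));
        Thread readerThread = new Thread(() -> startupLog.add("reader loop running"));
        try {
            dispatcherThread.start(); dispatcherThread.join();
            readerThread.start(); readerThread.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) {
        NativeInputManagerSketch im = new NativeInputManagerSketch();
        im.start();
        System.out.println(im.startupLog);
    }
}
```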
4. Connecting the Application (InputChannel Pair)
When an Activity creates its view, ViewRootImpl#setView() creates an empty InputChannel and registers it with the WindowManagerService, which creates a connected pair of sockets via socketpair() :
sockets[0] – server side, kept by InputDispatcher .
sockets[1] – client side, transferred to the app’s InputChannel .
The server socket is wrapped in a Connection object and stored in InputDispatcher::mConnectionsByFd . The client socket is handed to the app’s InputEventReceiver via InputChannel.transferTo() .
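The socketpair() handoff can be imitated in-process with a java.nio Pipe: the sink stands in for the fd the InputDispatcher keeps, the source for the fd given to the app's InputChannel. Note the simplification this sketch makes: a real socketpair is bidirectional (the client also sends "finished" signals back to the dispatcher), while a Pipe is one-way, so only the dispatcher-to-app direction is modeled:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.Pipe;

public class ChannelPairSketch {
    // Writes one byte on the "server" end and reads it back on the "client"
    // end, mimicking an event crossing the channel pair.
    static int roundTrip(byte opcode) {
        try {
            Pipe pipe = Pipe.open();
            Pipe.SinkChannel serverSide = pipe.sink();     // kept by the "dispatcher"
            Pipe.SourceChannel clientSide = pipe.source(); // handed to the "app"

            serverSide.write(ByteBuffer.wrap(new byte[] { opcode }));

            ByteBuffer in = ByteBuffer.allocate(1);
            clientSide.read(in);
            return in.get(0);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("client received opcode " + roundTrip((byte) 0x01));
    }
}
```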
5. Event Flow in the Application Process
The app’s native NativeInputEventReceiver registers the client socket with the Looper. When data arrives, the Looper calls handleEvent() , which creates a MotionEvent (or KeyEvent ) and invokes the Java method dispatchInputEvent() .
abstract class InputEventReceiver {
private void dispatchInputEvent(int seq, InputEvent event) {
onInputEvent(event);
}
}
// WindowInputEventReceiver is an inner class of ViewRootImpl
final class WindowInputEventReceiver extends InputEventReceiver {
@Override
public void onInputEvent(InputEvent event) {
enqueueInputEvent(event, this, 0, true);
}
}

The event is then processed by ViewRootImpl.enqueueInputEvent() , which forwards it to the appropriate ViewGroup or View via the responsibility‑chain methods dispatchTouchEvent() and onTouchEvent() .
6. ViewGroup Touch Handling
ViewGroup.dispatchTouchEvent() decides whether to:
Intercept the event for itself (by returning true from onInterceptTouchEvent() , after which the event is delivered to the ViewGroup's own onTouchEvent() ).
Pass the event to a child view (by iterating the children from the topmost down and calling dispatchTransformedTouchEvent() on each until one consumes it).
Honor a child's request to disable interception, made via requestDisallowInterceptTouchEvent(true) , which sets the FLAG_DISALLOW_INTERCEPT flag.
When a child finally handles the event, a TouchTarget object records the consumer so that subsequent MOVE/UP events are routed directly to it.
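The decision logic in this section can be condensed into a runnable model. The classes below are simplified stand-ins for android.view.View/ViewGroup, not the real API: on DOWN the parent offers the event to its children topmost-first, records the consumer as the touch target, and routes later MOVE/UP events straight to it:

```java
import java.util.List;

public class TouchDispatchSketch {
    static final int DOWN = 0, UP = 1, MOVE = 2;

    static class View {
        final String name;
        final boolean consumes;
        View(String name, boolean consumes) { this.name = name; this.consumes = consumes; }
        boolean dispatchTouchEvent(int action) { return onTouchEvent(action); }
        boolean onTouchEvent(int action) { return consumes; }
    }

    static class ViewGroup extends View {
        final List<View> children; // index 0 is bottom-most, like Android's child order
        View touchTarget;          // set on DOWN, analogous to TouchTarget
        boolean disallowIntercept; // models FLAG_DISALLOW_INTERCEPT

        ViewGroup(String name, List<View> children) { super(name, false); this.children = children; }

        boolean onInterceptTouchEvent(int action) { return false; }

        @Override
        boolean dispatchTouchEvent(int action) {
            if (!disallowIntercept && onInterceptTouchEvent(action)) {
                return onTouchEvent(action);       // parent steals the event
            }
            if (action == DOWN) {
                touchTarget = null;
                for (int i = children.size() - 1; i >= 0; i--) { // topmost child first
                    if (children.get(i).dispatchTouchEvent(action)) {
                        touchTarget = children.get(i);
                        return true;
                    }
                }
                return onTouchEvent(action);       // no child wanted it
            }
            // MOVE/UP: route directly to the recorded target
            return touchTarget != null ? touchTarget.dispatchTouchEvent(action)
                                       : onTouchEvent(action);
        }
    }

    public static void main(String[] args) {
        View bottom = new View("bottom", true);
        View top = new View("top", true);
        ViewGroup root = new ViewGroup("root", List.of(bottom, top));
        root.dispatchTouchEvent(DOWN);
        System.out.println("target = " + root.touchTarget.name); // topmost consumer wins
    }
}
```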
7. View Touch Consumption
For a leaf View , dispatchTouchEvent() first offers the event to any registered OnTouchListener and, if it is not consumed there, forwards it to onTouchEvent() . Subclasses override onTouchEvent() to react to MotionEvent.ACTION_DOWN , ACTION_MOVE , and ACTION_UP .
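A common pattern is to claim the gesture on ACTION_DOWN and act on ACTION_UP. This toy click counter illustrates it; the class and its integer action constants are simplified stand-ins for the android.view API:

```java
public class ClickableViewSketch {
    static final int ACTION_DOWN = 0, ACTION_UP = 1, ACTION_MOVE = 2;

    boolean pressed;
    int clicks;

    boolean onTouchEvent(int action) {
        switch (action) {
            case ACTION_DOWN:
                pressed = true;            // claim the gesture so MOVE/UP follow
                return true;
            case ACTION_MOVE:
                return pressed;            // keep tracking an active gesture
            case ACTION_UP:
                if (pressed) { clicks++; pressed = false; } // a completed tap
                return true;
            default:
                return false;
        }
    }

    public static void main(String[] args) {
        ClickableViewSketch v = new ClickableViewSketch();
        v.onTouchEvent(ACTION_DOWN);
        v.onTouchEvent(ACTION_UP);
        System.out.println("clicks = " + v.clicks);
    }
}
```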
Conclusion
The Android touch pipeline is a tightly coupled chain that starts with hardware, passes through the Linux input subsystem, traverses the native InputManagerService, and finally reaches the Java UI framework where Views and ViewGroups decide how to handle the event. Understanding each layer—EventHub, InputReader, InputDispatcher, InputChannel, and the View hierarchy—provides a solid foundation for debugging input‑related issues or building custom input handling logic.
Sohu Tech Products
A knowledge-sharing platform for Sohu's technology products. As a leading Chinese internet brand with media, video, search, and gaming services and over 700 million users, Sohu continuously drives tech innovation and practice. We’ll share practical insights and tech news here.