Implementing Black‑Box Automation Testing Using Android Accessibility Service
This article explains how to build an Android Accessibility service to perform black‑box automation testing, compares its advantages and limitations with UiAutomator1.0/2.0, and provides detailed configuration steps and code examples for service setup, event handling, gesture dispatch, and practical testing scenarios.
On most Android devices, an enabled accessibility service restarts automatically after a reboot, enabling capabilities that standard testing frameworks (UiAutomator 1.0, UiAutomator 2.0) cannot provide. This article describes how to set up such a service for black‑box testing and summarizes the challenges encountered and the scenarios where the approach applies.
Advantages of using an Accessibility service for black‑box testing
While the service runs, UiAutomator test cases can be executed; the service is paused during the test and resumes afterward.
On many devices the service starts automatically after a reboot, allowing testing of app auto‑start features.
Because it is an app‑level solution, it can read SMS messages directly, intercept verification codes, and perform login automation.
The service runs with a high priority and strong keep‑alive capability.
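The SMS‑interception point above ultimately reduces to string parsing once a message body is in hand (on device, the body would arrive via an SMS_RECEIVED BroadcastReceiver). The helper below is an illustrative sketch; the class name and the 4–6 digit regex are assumptions to adapt per carrier and app:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SmsCodeExtractor {
    // Most verification SMS carry a 4-6 digit code; adjust the pattern as needed.
    private static final Pattern CODE = Pattern.compile("\\b(\\d{4,6})\\b");

    /** Returns the first 4-6 digit group in the message body, or null if none. */
    public static String extract(String smsBody) {
        Matcher m = CODE.matcher(smsBody);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        System.out.println(extract("Your login code is 482913, valid for 5 minutes."));
        // prints 482913 (the "5" is too short to match)
    }
}
```

The extracted code can then be typed into the login form via ACTION_SET_TEXT or the gesture APIs described later in this article.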
Limitations
The service runs in a different process from the app under test, so only black‑box testing is possible.
Permission restrictions on various Android versions prevent access to system‑level information such as recent apps or current activity.
Click actions require the target node to report clickable=true; otherwise performAction(ACTION_CLICK) fails.
Screen interactions must be performed via gesture APIs; direct screen control is unavailable.
High‑version Android may block launching other apps from the service.
Injectable key events are limited to global actions such as Home, Back, notifications, quick settings, and the power dialog (via performGlobalAction).
If the service crashes, the user must manually re‑enable it in settings.
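The last limitation can at least be detected: Settings.Secure.ENABLED_ACCESSIBILITY_SERVICES holds a colon‑separated list of the flattened component names of enabled services, so a companion app can check at launch whether the service survived and prompt the user if not. The check reduces to the parsing below (the component names are illustrative):

```java
public class ServiceEnabledCheck {
    /**
     * enabledSetting mimics the value read via
     * Settings.Secure.getString(resolver, Settings.Secure.ENABLED_ACCESSIBILITY_SERVICES):
     * a colon-separated list of flattened component names.
     */
    public static boolean isEnabled(String enabledSetting, String flatComponentName) {
        if (enabledSetting == null) return false;       // nothing enabled at all
        for (String entry : enabledSetting.split(":")) {
            if (entry.equalsIgnoreCase(flatComponentName)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        String setting = "com.example/.service.TaskAccessibilityService:com.other/.OtherService";
        System.out.println(isEnabled(setting, "com.example/.service.TaskAccessibilityService"));
        // prints true
    }
}
```

When the check fails, the app can open Settings.ACTION_ACCESSIBILITY_SETTINGS so the user can re‑enable the service with one tap.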
Service configuration and implementation
Define a custom AccessibilityService and override onServiceConnected to configure the events and flags you need:
AccessibilityServiceInfo accessibilityServiceInfo = getServiceInfo();
if (accessibilityServiceInfo == null) {
    accessibilityServiceInfo = new AccessibilityServiceInfo();
}
accessibilityServiceInfo.eventTypes |= AccessibilityEvent.TYPE_VIEW_CLICKED;
accessibilityServiceInfo.eventTypes |= AccessibilityEvent.TYPE_VIEW_SELECTED;
accessibilityServiceInfo.eventTypes |= AccessibilityEvent.TYPE_VIEW_FOCUSED;
accessibilityServiceInfo.eventTypes |= AccessibilityEvent.TYPE_VIEW_TEXT_CHANGED;
accessibilityServiceInfo.flags |= AccessibilityServiceInfo.DEFAULT;
accessibilityServiceInfo.flags |= AccessibilityServiceInfo.FLAG_REQUEST_ENHANCED_WEB_ACCESSIBILITY;
setServiceInfo(accessibilityServiceInfo);

Declare the service in AndroidManifest.xml with the required permission and meta‑data pointing to an XML configuration file:
<service
android:label="Assistive Key Service"
android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE"
android:exported="true"
android:directBootAware="true"
android:name=".service.TaskAccessibilityService">
<intent-filter>
<action android:name="android.accessibilityservice.AccessibilityService"/>
</intent-filter>
<meta-data
android:name="android.accessibilityservice"
android:resource="@xml/accessibility_service_config"/>
</service>

The XML configuration defines properties such as description, feedback type, gesture support, and flags:
<?xml version="1.0" encoding="utf-8"?>
<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
android:description="@string/accessibility_service_description"
android:accessibilityFeedbackType="feedbackAllMask"
android:canRetrieveWindowContent="true"
android:canPerformGestures="true"
android:notificationTimeout="100"
android:accessibilityFlags="flagDefault"/>

Gesture implementation for click and swipe
For Android below API 24, use AccessibilityNodeInfo.performAction with actions such as ACTION_CLICK. From API 24 onward (which introduced dispatchGesture), read the node's screen bounds with getBoundsInScreen and simulate a tap instead.
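The "calculate the node's screen bounds" step is plain arithmetic: getBoundsInScreen fills a Rect, and the tap point is its center. A minimal, framework‑free sketch (the Rect class here is an illustrative stand‑in mirroring android.graphics.Rect's fields):

```java
public class TapPoint {
    // Stand-in for android.graphics.Rect as filled by node.getBoundsInScreen(rect).
    static class Rect {
        int left, top, right, bottom;
        Rect(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
    }

    /** Returns {x, y} at the center of the node's on-screen bounds. */
    static int[] centerOf(Rect bounds) {
        return new int[] {
            bounds.left + (bounds.right - bounds.left) / 2,
            bounds.top + (bounds.bottom - bounds.top) / 2
        };
    }

    public static void main(String[] args) {
        int[] p = centerOf(new Rect(100, 200, 300, 260));
        System.out.println(p[0] + "," + p[1]); // prints 200,230
    }
}
```

On device, these coordinates become the moveTo arguments of the Path passed to dispatchGesture, as shown in the snippet that follows.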
Path mPath = new Path();
mPath.moveTo(startX, startY);
// For swipe
mPath.lineTo(endX, endY);
dispatchGesture(new GestureDescription.Builder()
.addStroke(new GestureDescription.StrokeDescription(mPath, 50 /* start delay ms */, 500 /* duration ms */))
.build(), new GestureResultCallback() {
@Override
public void onCompleted(GestureDescription gestureDescription) {
super.onCompleted(gestureDescription);
System.out.println("Simulated gesture succeeded");
}
@Override
public void onCancelled(GestureDescription gestureDescription) {
super.onCancelled(gestureDescription);
System.out.println("Simulated gesture failed");
}
}, null);

Utility methods clickNode and clickScreen illustrate how to fall back to parent nodes when the target is not clickable, and how to perform a tap at calculated screen coordinates.
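The fallback logic behind clickNode (whose exact implementation the article does not show) amounts to walking up the parent chain until a node reports isClickable(). A simplified model of that walk, with a stand‑in Node class in place of AccessibilityNodeInfo:

```java
public class ClickFallback {
    // Minimal stand-in for AccessibilityNodeInfo's getParent()/isClickable() accessors.
    static class Node {
        final Node parent;
        final boolean clickable;
        Node(Node parent, boolean clickable) { this.parent = parent; this.clickable = clickable; }
    }

    /** Walks up from target until a clickable ancestor is found, or returns null. */
    static Node findClickable(Node target) {
        for (Node n = target; n != null; n = n.parent) {
            if (n.clickable) return n;
        }
        return null;
    }

    public static void main(String[] args) {
        Node root = new Node(null, true);    // clickable container
        Node label = new Node(root, false);  // non-clickable leaf, e.g. a TextView
        System.out.println(findClickable(label) == root); // prints true
    }
}
```

With the real API, the loop would call node.getParent() and node.isClickable(), invoke ACTION_CLICK on the match, and fall back to clickScreen with the bounds center when no clickable ancestor exists.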
Choosing a testing approach
UiAutomator 1.0 : High permissions via shell, strong keep‑alive, suitable for multi‑app scripts, but not compatible with Android R (11) emulators.
UiAutomator 2.0 : Runs in the same process as the app, lower keep‑alive, best for white‑box testing and complex business logic.
Accessibility service : Requires its own app permission, offers the strongest keep‑alive (survives device reboot), ideal for assistive apps, device maintenance, and initializing test environments.
In summary, using an Android Accessibility service provides a robust way to perform black‑box automation testing across a wide range of scenarios, especially when persistent background execution and cross‑app interaction are required.
360 Tech Engineering
Official tech channel of 360, building the most professional technology aggregation platform for the brand.