How Hooop Turns HarmonyOS into an Offline AI Basketball Coach
Hooop leverages HarmonyOS's on‑device AI and custom vision algorithms to provide real‑time, offline basketball training: it detects shots, analyses trajectories, automatically generates scoring highlights, and tracks performance metrics without an internet connection.
Introduction
The team built Hooop, an intelligent basketball training app based on HarmonyOS's on‑device large model and self‑developed vision algorithms, and entered it as a core submission in the 2025 HarmonyOS Innovation Competition.
Goals
Real‑time shooting capture and analysis using the device camera.
Local video import and offline analysis with automatic scoring‑clip generation.
Product Overview
Hooop integrates HarmonyOS native AI capabilities to recognize and analyze basketball shots and results offline, providing users with professional‑grade statistics and training replay even without network access.
Overall Architecture
The app’s main modules include video data acquisition, video clipping, training data statistics, frame inference, and account management.
FFmpeg – video frame extraction and clipping.
OpenCV – frame preprocessing (scaling, padding, transpose).
CANN Kit – hardware‑accelerated inference on HarmonyOS.
Model Zoo – offline object‑detection model (YOLOv8) for basketball and hoop.
Processing Pipeline
Video frame acquisition: capture frames from the camera or read frames from a local video file using FFmpeg.
Frame preprocessing: resize to 640×640, pad, transpose to CHW format, and enqueue for inference.
Frame inference: run the YOLOv8 model to locate the basketball and hoop, then compute whether the trajectory intersects a rectangular hoop region.
Video clipping: automatically extract scoring clips.
Shot data recording: store hit‑rate statistics across weekly, monthly, and career dimensions.
The core logic judges a shot as made when the segment connecting two consecutive detected basketball centre points intersects the defined hoop rectangle; otherwise the shot is counted as a miss.
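This made‑shot test reduces to a segment‑versus‑axis‑aligned‑rectangle intersection. A minimal sketch using Liang–Barsky clipping (the function name and coordinate convention are assumptions, not the app's actual API):

```python
def segment_intersects_rect(p1, p2, rect):
    """True if segment p1->p2 crosses axis-aligned rect (xmin, ymin, xmax, ymax)."""
    x1, y1 = p1
    x2, y2 = p2
    xmin, ymin, xmax, ymax = rect
    dx, dy = x2 - x1, y2 - y1
    t0, t1 = 0.0, 1.0
    # Clip the segment's parameter range against each rectangle edge.
    for p, q in ((-dx, x1 - xmin), (dx, xmax - x1),
                 (-dy, y1 - ymin), (dy, ymax - y1)):
        if p == 0:
            if q < 0:
                return False  # parallel to this edge and outside it
        else:
            t = q / p
            if p < 0:
                t0 = max(t0, t)
            else:
                t1 = min(t1, t)
            if t0 > t1:
                return False  # clipped away entirely
    return True
```

For example, a ball centre falling from above the rim through the hoop region returns True, while a trajectory passing beside it returns False.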
Performance Optimisation
Initial processing took about 55 ms per frame (≈15 ms preprocessing + ≈40 ms inference), exceeding the 33 ms budget for 30 fps video. Optimisations included:
Reducing the effective frame rate to 20 fps by dropping every third frame, which raises the per‑frame budget to 50 ms.
Adopting a producer‑consumer pattern: a preprocessing thread (A) pushes ready frames into a lock‑free queue, while an inference thread (B) pulls frames for model execution, cutting total per‑frame time to ≈40 ms and meeting the 20 fps requirement.
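The producer‑consumer split can be sketched in Python with standard threads; `queue.Queue` stands in for the lock‑free queue used on device, and the function names are illustrative:

```python
import queue
import threading

def run_pipeline(frames, preprocess, infer, queue_size=4):
    """Thread A preprocesses frames and enqueues them; thread B runs inference.

    A bounded queue provides back-pressure so preprocessing never races
    far ahead of the (slower) inference stage.
    """
    q = queue.Queue(maxsize=queue_size)
    results = []

    def producer():
        for frame in frames:
            q.put(preprocess(frame))
        q.put(None)  # sentinel: no more frames

    def consumer():
        while True:
            item = q.get()
            if item is None:
                break
            results.append(infer(item))

    a = threading.Thread(target=producer)
    b = threading.Thread(target=consumer)
    a.start(); b.start()
    a.join(); b.join()
    return results
```

Because the two stages overlap, total per‑frame latency approaches max(preprocess, inference) rather than their sum, which is how the pipeline drops from ≈55 ms to ≈40 ms per frame.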
Model Training and Integration
Positive and negative samples of basketball and hoop were collected to train a PyTorch model. Because HarmonyOS cannot run PyTorch or ONNX models directly, the workflow converted the model to an ONNX file and then to an OM offline model using the OMG conversion tool.
Key conversion steps:
convert_cann_compatible.py – converts the PyTorch model to ONNX.
OMG tool – converts the ONNX model to an OM model runnable on HarmonyOS.
Conclusion
Hooop demonstrates how on‑device AI can deliver a fully offline, real‑time basketball training experience on HarmonyOS, while providing reusable technical assets such as CANN‑accelerated inference, FFmpeg‑based video handling, and a multithreaded producer‑consumer architecture for high‑performance frame processing.
Sohu Smart Platform Tech Team
The Sohu News app's technical sharing hub, offering in‑depth technical analyses, the latest industry news, and developer stories. Follow us for the team's day‑to‑day insights.