AMAP-TECH Algorithm Competition: Dynamic Road‑Condition Analysis from In‑Vehicle Video Images
Alibaba Amap’s AMAP‑TECH competition invites participants to develop AI computer‑vision models that classify real‑time road conditions—smooth, slow, or congested—from short sequences of dash‑cam images, using a labeled dataset of 1,500 training sequences and a weighted F1‑score evaluation, with cash prizes up to ¥60,000.
Alibaba Amap (AMAP‑TECH) launches an algorithm competition on July 8. The challenge is to develop AI‑driven computer‑vision methods that infer real‑time road‑condition status (smooth, slow, congested) from short sequences of in‑vehicle video frames.
Background
Amap (Gaode Map) provides large-scale positioning and navigation services. Traditional traffic-state estimation relies on vehicle trajectories, which become unreliable on roads with sparse traffic or under abnormal driving patterns. Video frames captured by dashcams contain richer visual cues (vehicle count, road width, occupancy) that can improve traffic-state accuracy.
Problem Definition
Given a sequence of 3–5 images with GPS timestamps, one of which is designated the reference (key) frame, participants must output the traffic-state label (smooth, slow, or congested) for the entire sequence. Labels are anchored to the reference frame.
Data Description
The dataset is split into a preliminary set and a final set released after the preliminary round. The preliminary set contains 1 500 sequences (≈ 7 000 images) for training and 600 sequences (≈ 2 800 images) for testing. Road-state distribution: smooth 70 %, slow 10 %, congested 20 %. Labels are derived primarily from the reference frame, with additional context taken from surrounding frames when vehicle density is high.
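The 70/10/20 split is heavily skewed toward the "smooth" class. One common mitigation (not mandated by the competition) is inverse-frequency class weighting in the training loss. A minimal sketch using the announced distribution:

```python
# Road-state distribution stated in the announcement:
# smooth 70 %, slow 10 %, congested 20 %.
freq = {"smooth": 0.70, "slow": 0.10, "congested": 0.20}

# Inverse-frequency weights, normalized so they sum to the number of
# classes; these could be passed to a weighted loss during training.
inv = {c: 1.0 / f for c, f in freq.items()}
norm = len(inv) / sum(inv.values())
weights = {c: w * norm for c, w in inv.items()}
print(weights)
```

With these numbers, "slow" (the rarest class) receives roughly seven times the weight of "smooth"; whether such weighting actually helps on this dataset is an empirical question.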
Data Format
Each sequence is stored in a folder containing the reference frame and its neighboring frames (max 5 frames). Annotation files are provided in JSON, e.g.:
{
  "annotations": [
    {
      "id": "000001",
      "key_frame": "2.jpg",
      "status": 0,
      "frames": [
        {"frame_name": "1.jpg", "gps_time": 1552806921},
        {"frame_name": "2.jpg", "gps_time": 1552806926},
        {"frame_name": "3.jpg", "gps_time": 1552806931},
        {"frame_name": "4.jpg", "gps_time": 1552806936}
      ]
    },
    {
      "id": "000002",
      "key_frame": "3.jpg",
      "status": 2,
      "frames": [
        {"frame_name": "1.jpg", "gps_time": 1555300555},
        {"frame_name": "2.jpg", "gps_time": 1555300560},
        {"frame_name": "3.jpg", "gps_time": 1555300565},
        {"frame_name": "4.jpg", "gps_time": 1555300570},
        {"frame_name": "5.jpg", "gps_time": 1555300580}
      ]
    }
  ]
}
Evaluation Metric
Submissions are scored with a weighted F1-score over the three traffic-state classes, where each class's F1 is weighted by that class's share of the ground-truth labels. A higher score indicates better classification performance.
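Tying the annotation format to the metric, the following self-contained sketch parses a toy annotation snippet and scores a set of predictions with a weighted F1. The status-code mapping (0 = smooth, 1 = slow, 2 = congested) is an assumption; the announcement does not spell it out.

```python
import json
from collections import Counter

# Toy annotation data in the competition's JSON format (status mapping
# assumed: 0 = smooth, 1 = slow, 2 = congested).
ANNOTATIONS = json.loads("""
{"annotations": [
    {"id": "000001", "key_frame": "2.jpg", "status": 0,
     "frames": [{"frame_name": "1.jpg", "gps_time": 1552806921}]},
    {"id": "000002", "key_frame": "3.jpg", "status": 2,
     "frames": [{"frame_name": "1.jpg", "gps_time": 1555300555}]}
]}
""")

def weighted_f1(y_true, y_pred, labels=(0, 1, 2)):
    """Per-class F1 averaged with weights equal to each class's share
    of the ground-truth labels."""
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        score += (support[c] / total) * f1
    return score

y_true = [seq["status"] for seq in ANNOTATIONS["annotations"]]
y_pred = [0, 2]  # a perfect toy prediction
print(weighted_f1(y_true, y_pred))  # 1.0 when every sequence is correct
```

In practice one would likely reach for `sklearn.metrics.f1_score(y_true, y_pred, average="weighted")`, which this function mirrors; the manual version is shown to make the weighting explicit.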
Expert Judges
The jury includes professors and senior researchers from Peking University, the Chinese Academy of Sciences, and Alibaba’s Amap technical committee, ensuring a rigorous scientific review.
Schedule & Participation
Registration: July 8 – August 28 (UTC+8)
Preliminary round: July 8 – August 31
Semifinal round: September 4 – October 13
Final round: Late October (date TBD)
The dataset becomes publicly downloadable on July 8. After 10:00 AM on July 20, participants may submit JSON results for the test set.
Open to individuals, universities, research institutes, enterprises, and maker teams. Teams may consist of up to three members.
Prizes
Champion: ¥60 000 + certificate
Runner‑up: ¥40 000 + certificate
Third place: ¥20 000 + certificate
Honorable mentions (2 teams): ¥10 000 each + certificate
Top‑10 semifinal teams may receive fast‑track recruitment opportunities with Alibaba Amap.
Communication
Participants can join the DingTalk group (ID 31160357) for announcements and Q&A. QR codes are provided in the original announcement.
Amap Tech
Official Amap technology account showcasing all of Amap's technical innovations.