Technical Overview of Augmented Reality Head‑Up Display (ARHUD) for Automotive Navigation
ARHUD integrates augmented‑reality graphics into a vehicle’s head‑up display, projecting speed limits, lane cues and navigation data at a virtual image distance beyond 10 m across a wide field of view. Doing this safely and with low cognitive load requires precise eye‑point tracking, coordinate conversion, and advanced optics such as TFT‑LCD, DLP or LCOS to overcome challenges of brightness, packaging size, alignment and limited FOV.
ARHUD (Augmented Reality Head‑Up Display) combines augmented reality with traditional HUD technology to project rendered elements onto the real world, offering a low‑cognitive‑load way for drivers to receive information.
The HUD concept originated with World War II gunsights and fighter aircraft, shifted to civilian use in the early 1980s, and was formally defined in the early 1990s, eventually becoming an automotive feature.
ARHUD navigation projects speed limits, steering cues, lane‑guidance lines, and other navigation data directly into the driver’s forward line of sight, letting the driver keep their head upright and eyes on the road.
In August 2022, Amap (Gaode) partnered with BAIC and Huawei to launch the BAIC Magic Cube ARHUD navigation system, leveraging extensive R&D and industry‑leading experience.
1. Virtual Image Distance (VID)
VID is the perceived distance from the virtual image to the driver’s eye. Human eyes have different focal lengths for near and far objects; if VID is too short, the ARHUD image becomes blurry when the driver looks at distant objects.
Traditional HUDs have a VID of about 2.5 m, whereas ARHUDs aim for >10 m, with some lane‑change displays requiring projection distances of up to 20 m.
A W‑HUD (windshield HUD) works like a projector that reflects the image onto the windshield, but its limited image size (typically 15–20 inches at ~3 m distance) and lack of road fusion force the driver to shift focus, contradicting the HUD’s original intent.
2. Field Of View (FOV)
FOV is the angular span of the display centered on the driver’s eye, including horizontal and vertical components. Conventional HUDs offer a narrow ~5° FOV, while ARHUDs target >10° horizontally, with some models reaching 13°–20°.
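The FOV and VID together determine how large the virtual image appears in the scene. As a minimal sketch (the 10 m VID and 10° horizontal FOV are the article’s representative figures, not values from a specific product), the physical width of the virtual image plane follows from basic trigonometry:

```python
import math

def virtual_image_width(vid_m: float, fov_h_deg: float) -> float:
    """Width in metres of the virtual image plane at distance vid_m
    for a given horizontal field of view."""
    return 2.0 * vid_m * math.tan(math.radians(fov_h_deg) / 2.0)

# A 10 deg horizontal FOV at a 10 m VID spans roughly 1.75 m of virtual width.
print(round(virtual_image_width(10.0, 10.0), 2))  # → 1.75
```

This is why widening the FOV at a long VID drives up the optical module size: the image plane grows linearly with both VID and tan(FOV/2).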
3. Eye Point
The eye point (x, y, z) is defined in the vehicle coordinate system with the vehicle front center as origin, measured in meters. It varies with driver height, seating posture, and head position.
4. Virtual Image Rotation (Three Degrees of Freedom)
Rotation angles around the X, Y, and Z axes (LDA/down‑view, roll, and yaw) define the orientation of the virtual image.
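The three angles can be composed into a single rotation matrix for the virtual image. The sketch below assumes down‑view (pitch) about X, roll about Y, and yaw about Z, composed in Z·Y·X order; the actual axis convention and composition order are fixed by each ARHUD’s calibration, so treat them as assumptions:

```python
import math

def rot_x(a):  # down-view / pitch about the X axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # roll about the Y axis
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):  # yaw about the Z axis
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    """3x3 matrix product on nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def image_rotation(pitch, roll, yaw):
    """Orientation of the virtual image as Rz(yaw) @ Ry(roll) @ Rx(pitch)."""
    return matmul(rot_z(yaw), matmul(rot_y(roll), rot_x(pitch)))
```

With all three angles at zero the result is the identity, i.e. the virtual image faces the driver squarely.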
5. Virtual Image ↔ World Coordinate Conversion
The conversion follows the standard camera model: a world point is first projected to camera pixels, and the same mathematics maps world coordinates to virtual‑image pixels. Knowing the VID, FOV, eye point, and virtual‑image rotation angles enables bidirectional conversion between world coordinates and virtual‑image coordinates.
The camera focal length plays the same role as the HUD virtual‑image distance; if the VID is too short, the driver must refocus between the image and the road, undermining the HUD’s benefit.
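The forward (world‑to‑virtual‑image) direction can be sketched with a pinhole projection. This is a simplified model, not the production pipeline: it assumes the image plane is fronto‑parallel to the eye with +X forward, +Y left, +Z up, and the three virtual‑image rotation angles are zero; the resolution and FOV values in the example are illustrative.

```python
import math

def world_to_virtual_pixel(point, eye, fov_h_deg, fov_v_deg, res_w, res_h):
    """Project a world point (vehicle frame, metres) onto the virtual image
    and return (u, v) pixel coordinates, or None if the point is behind
    the eye. Once the FOV fixes the focal length in pixels, the VID itself
    cancels out of the projection."""
    dx = point[0] - eye[0]  # forward distance from the eye
    if dx <= 0:
        return None
    dy = point[1] - eye[1]  # lateral offset (left positive)
    dz = point[2] - eye[2]  # vertical offset (up positive)
    # Pinhole focal lengths in pixels, derived from FOV and resolution.
    fx = (res_w / 2.0) / math.tan(math.radians(fov_h_deg) / 2.0)
    fy = (res_h / 2.0) / math.tan(math.radians(fov_v_deg) / 2.0)
    u = res_w / 2.0 - fx * dy / dx  # pixel origin top-left, u grows rightward
    v = res_h / 2.0 - fy * dz / dx  # v grows downward
    return (u, v)

# A point straight ahead of the eye lands at the display centre.
print(world_to_virtual_pixel((15.0, 0.0, 1.2), (0.0, 0.0, 1.2),
                             10.0, 5.0, 1920, 720))  # → (960.0, 360.0)
```

The inverse direction (virtual‑image pixel back to a world ray) follows by running the same equations backwards and intersecting the ray with the road plane.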
6. Applications of Coordinate Conversion
6.1 Validation of virtual‑image projection accuracy – by mapping representative pixel points (typically nine) to world coordinates and checking alignment.
6.2 Preventing lane‑guidance lines from exceeding the virtual‑image display area – by converting lane‑change information into world coordinates via selected virtual‑image pixels, ensuring the projected lines stay within view.
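The second application (6.2) reduces, at the pixel level, to a bounds check: any projected guidance‑line vertex outside the displayable area must be dropped or re‑anchored. A minimal sketch, where the resolution and safety margin are illustrative values rather than figures from a production ARHUD:

```python
def clip_to_display(pixels, res_w=1920, res_h=720, margin=16):
    """Keep only guidance-line vertices that fall inside the virtual-image
    display area, with a safety margin so lines never touch the image edge.
    res_w, res_h and margin are hypothetical example values."""
    return [(u, v) for (u, v) in pixels
            if margin <= u <= res_w - margin and margin <= v <= res_h - margin]

# Only the first vertex survives; the others lie outside the displayable area.
print(clip_to_display([(100.0, 100.0), (-5.0, 300.0), (1920.0, 500.0)]))
# → [(100.0, 100.0)]
```

In practice the check runs in the other direction too: the chosen boundary pixels are converted back to world coordinates to decide how far ahead a lane‑change cue can be drawn before it would leave the view.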
7. ARHUD Hardware Technologies
7.1 TFT‑LCD – LED light passes through liquid‑crystal cells; mature, low‑cost, but suffers from sunlight wash‑out and limited brightness.
7.2 DLP (Digital Light Processing) – uses TI’s DMD chip; offers vivid colors and fine detail, but is expensive (>¥5,000) and can exhibit rainbow artifacts.
7.3 LCOS (Liquid Crystal on Silicon) – reflective silicon‑based LCD; high optical efficiency and controllable cost, but currently limited in volume production and has a relatively large optical module.
8. Main Technical Challenges of ARHUD
Insufficient FOV – current devices cover only a small portion of the driver’s visual field.
Projection brightness – must adapt to varying ambient light and weather conditions.
Hardware volume – larger optics needed for wide FOV increase system size, conflicting with vehicle packaging.
Real‑world alignment – requires real‑time correction using map data, sensors, and GPS to match AR graphics with the actual road.
Dynamic eye‑point tracking – essential to avoid image blur or misalignment as the driver moves.
In conclusion, ARHUD technology has become a competitive frontier in automotive navigation. Future developments, including potential contributions from companies like Apple, will need to overcome these challenges to achieve true “what‑you‑see‑is‑what‑you‑get” navigation experiences.
Amap Tech
Official Amap technology account showcasing all of Amap's technical innovations.