GuanYuan Data Tech Team
Jun 30, 2022 · Big Data
Why Spark 3.2 OOMs After Upgrade: Deep Dive into AQE and StageMetrics
After upgrading Spark from 3.0.1 to 3.2.1, an ETL job began failing with OutOfMemory errors. This article examines the root causes, including AQE-related metric accumulation, skipped stages, and stage-metric growth, then walks through the debugging process and a code-level fix to mitigate the memory pressure.
AQE · OutOfMemory · Spark