
Investigation and Resolution of Full GC Alerts Caused by Groovy Script Execution in a Java Service

The report analyzes recurring morning Full GC alerts in a Java service, identifies a Groovy‑driven scheduled task as the root cause through JVM parameter review, log inspection, and memory profiling, and proposes concrete fixes such as reusing a single GroovyShell instance and clearing its classloader cache.

Zhuanzhuan Tech

Problem Background

Prometheus monitoring showed that a service was generating Full GC alerts every day between 08:00 and 12:00. The task was to investigate and resolve the issue.

Analysis Process

2.1 Parameter Configuration

The JVM was started with the following options:

-Xms3g -Xmx3g -Xmn1g -XX:MetaspaceSize=128m -XX:ParallelGCThreads=5 -XX:+UseConcMarkSweepGC -XX:+UseParNewGC -XX:+UseCMSCompactAtFullCollection -XX:CMSInitiatingOccupancyFraction=80 -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+HeapDumpOnOutOfMemoryError

Young generation size: 1 GB, using ParNewGC; Old generation size: 2 GB, using ConcMarkSweepGC. CMS triggers when old‑generation usage reaches 80%.
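Given these sizes, the CMS trigger point works out to roughly 1.6 GB of old-generation occupancy. A minimal arithmetic sketch (class and method names are illustrative):

```java
public class CmsTriggerThreshold {
    // Old-generation occupancy (in MB) at which CMS kicks in, per the flags above
    static long triggerMb() {
        long heapBytes  = 3L * 1024 * 1024 * 1024;    // -Xmx3g
        long youngBytes = 1L * 1024 * 1024 * 1024;    // -Xmn1g
        long oldBytes   = heapBytes - youngBytes;     // old generation = 2 GB
        long triggerBytes = (long) (oldBytes * 0.80); // CMSInitiatingOccupancyFraction=80
        return triggerBytes / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println(triggerMb() + " MB"); // about 1638 MB
    }
}
```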

2.2 Locating the Problem

Because the alerts were concentrated in the morning, a scheduled task was suspected. Two candidate cron jobs were examined:

Task 1: CustomerScheduleJobService.updateCustomerDataDaily  0 0/30 8,9,10,11,12 * * ?

Task 2: CustomerStaffScheduleJobService.jobCreateTask  0 10,40 7,8,9,10,11 * * ?

Log analysis showed:

[03-09 08:00:00 062 INFO] … CustomerUpdateInfoDailyBll - (123) … thread begin…
[03-09 08:01:25 476 INFO] … CustomerUpdateInfoDailyBll - (125) … end total=0

Task 1 executed quickly with almost no business logic. Task 2 ran for about 35 minutes (08:10-08:45), matching the Full GC window, which made it the likely culprit.
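The 35-minute figure is plain clock arithmetic on the log timestamps; a small illustrative sketch (class and method names are hypothetical):

```java
import java.time.Duration;
import java.time.LocalTime;

public class TaskWindow {
    // Task 2 fires at minute 10 and 40 of hours 7-11 (cron: 0 10,40 7,8,9,10,11 * * ?)
    static long observedRunMinutes() {
        LocalTime start = LocalTime.of(8, 10); // 08:10 fire seen in the logs
        LocalTime end   = LocalTime.of(8, 45); // end of the run seen in the logs
        return Duration.between(start, end).toMinutes();
    }

    public static void main(String[] args) {
        System.out.println(observedRunMinutes()); // 35
    }
}
```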

2.3 JVM Analysis

2.3.1 Single‑Day Monitoring Charts

Memory and GC trend graphs (shown in the original images) revealed a steady increase in the old generation during Task 2 execution, with only slight reduction after Full GC.

2.3.2 Alarm‑Time‑Window Charts

Similar patterns were observed when focusing on the 08:00‑12:00 window.

2.3.3 Detailed Chart Analysis

2.3.3.1 Old Generation Changes

During the task, the old generation grew noticeably and did not drop significantly after Full GC because objects remained referenced.

2.3.3.2 Survivor Space Changes

The 100 MB survivor space was typically 80 MB occupied, spiking above 90 MB, causing frequent YGC (≈5 times per minute).

2.3.3.3 Non‑Heap / Metaspace / Compressed Class Space

jstat showed that the machine running the task had much larger Metaspace and Compressed Class Space usage than a comparable machine without the task.

Method-area (Metaspace) usage grew in sync with the old generation, indicating that its reclamation is driven by old-generation collection.

2.3.3.4 Heap Dump Analysis

Heap dump revealed that Groovy‑related classes accounted for 57.57% of the heap.

2.4 Additional Parameter Configuration

Java and Groovy versions:

Java version: "1.8.0_191"

Groovy dependency (Maven):

<dependency>
    <groupId>org.codehaus.groovy</groupId>
    <artifactId>groovy-all</artifactId>
    <version>2.4.15</version>
</dependency>

The scheduled task uses Groovy to evaluate rule expressions:

public class GroovyShellUtils {
    private static LoggerHelper logger = LoggerHelper.getLoggerHelper(GroovyShellUtils.class);

    public static boolean explain(String scriptText) {
        try {
            // A new GroovyShell (and with it a new GroovyClassLoader) is created on every call
            GroovyShell groovyShell = new GroovyShell();
            // Each evaluate() compiles scriptText into a brand-new script class
            Object evaluate = groovyShell.evaluate(scriptText);
            return (boolean) evaluate;
        } catch (Exception e) {
            logger.error("", e);
        }
        return false;
    }
}

// usage: one evaluation per rule, so every rule of every user generates a new class
for (String rule : rules) {
    boolean res = GroovyShellUtils.explain(rule);
}

Each execution creates a new script class, leading to rapid Metaspace growth (e.g., 111,734 generated classes for 15,962 users × 7 rules).

// From groovy.lang.GroovyShell: every script gets a fresh class name
protected synchronized String generateScriptName() {
    return "Script" + (++counter) + ".groovy";
}
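At one generated class per evaluate() call, the reported class count lines up exactly with the workload; an illustrative check (class and method names are hypothetical):

```java
public class ClassCountEstimate {
    // One fresh script class is compiled per rule evaluation
    static int generatedClasses(int users, int rulesPerUser) {
        return users * rulesPerUser;
    }

    public static void main(String[] args) {
        // Figures from the incident: 15,962 users, 7 rules each
        System.out.println(generatedClasses(15_962, 7)); // 111734
    }
}
```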

GroovyShell compiles scripts through groovy.lang.GroovyClassLoader, which keeps strong references to every class it generates. As long as the classloader itself remains reachable, none of those classes can be unloaded from Metaspace, so the leak accumulates.

Solution

1. Reuse a single GroovyShell instance for all scripts instead of creating one inside a loop.

2. After each execution, clear the classloader cache with shell.getClassLoader().clearCache() to release generated classes.
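Putting both fixes together, a sketch of the reworked utility, assuming the groovy-all dependency shown earlier (the synchronized keyword and the instanceof check are defensive additions, and the original LoggerHelper is replaced with a stack trace to keep the sketch self-contained):

```java
import groovy.lang.GroovyShell;

public class GroovyShellUtils {
    // One shell, and therefore one GroovyClassLoader, shared by all evaluations
    private static final GroovyShell SHELL = new GroovyShell();

    public static synchronized boolean explain(String scriptText) {
        try {
            Object result = SHELL.evaluate(scriptText);
            return result instanceof Boolean && (Boolean) result;
        } catch (Exception e) {
            e.printStackTrace(); // the original logs this via LoggerHelper
            return false;
        } finally {
            // Drop strong references to the generated script classes so the
            // next old-generation collection can unload them from Metaspace
            SHELL.getClassLoader().clearCache();
        }
    }
}
```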

Written by

Zhuanzhuan Tech

A platform for Zhuanzhuan R&D and industry peers to learn and exchange technology, regularly sharing frontline experience and cutting‑edge topics. We welcome practical discussions and sharing; contact waterystone with any questions.
