Backend Development · 13 min read

Optimizing Large-Scale Excel Import/Export in Java with EasyExcel and Thread Pools

By combining EasyExcel’s low‑memory parsing with Java 8 functional style, reflection‑based generic annotations, a thread‑pool‑driven batch listener, and flexible export utilities that support dynamic headers and map‑based rows, Java back‑ends can safely import and export millions of Excel rows without OOM errors.

Java Tech Enthusiast

Java web applications often need to import and export large Excel files. Traditional libraries such as EasyPOI or Hutool load entire workbooks into memory, which can cause severe memory consumption and OOM errors when processing massive datasets.

EasyExcel provides a more memory‑efficient solution, but additional techniques are required to handle very large volumes safely.

Key optimization approaches:

Use Java 8 functional programming to reduce boilerplate code during data import.

Leverage reflection and generic annotations to map any Excel sheet to a single import interface.

Employ a thread‑pool‑based listener to batch‑process rows, minimizing memory pressure.

Implement generic export methods that can write arbitrary object lists or dynamic header structures.

Maven dependency for EasyExcel

<!-- EasyExcel dependency -->
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>easyexcel</artifactId>
    <version>3.0.5</version>
</dependency>

Entity example (StuInfo)

@Data
@NoArgsConstructor
@AllArgsConstructor
@TableName("stu_info")
@ApiModel("学生信息")
public class StuInfo implements Serializable {
    private static final long serialVersionUID = 1L;

    @ColumnWidth(10)
    @ApiModelProperty("姓名")
    @ExcelProperty(value = "姓名", order = 0)
    @ExportHeader(value = "姓名", index = 1)
    private String name;

    @ExcelProperty(value = "年龄", order = 1)
    @ApiModelProperty("年龄")
    @ExportHeader(value = "年龄", index = 2)
    private Integer age;

    @ExcelProperty(value = "身高", order = 2)
    @ApiModelProperty("身高")
    @ExportHeader(value = "身高", index = 4)
    private Double tall;

    @ExcelProperty(value = "自我介绍", order = 3)
    @ApiModelProperty("自我介绍")
    @ExportHeader(value = "自我介绍", index = 3, ignore = true)
    private String selfIntroduce;

    @ExcelProperty(value = "图片信息", order = 4)
    @ApiModelProperty("图片信息")
    @ExportHeader(value = "图片信息", ignore = true)
    private Blob picture;

    @ExcelProperty(value = "性别", order = 5)
    @ApiModelProperty("性别")
    private Integer gender;

    @DateTimeFormat("yyyy-MM-dd HH:mm:ss")
    @ExcelProperty(value = "入学时间", order = 6)
    @ApiModelProperty("入学时间")
    private String intake;

    @ExcelProperty(value = "出生日期", order = 7)
    @ApiModelProperty("出生日期")
    private String birthday;
}
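The @ExportHeader annotation used on the entity is not part of EasyExcel; it appears to be project-specific. A minimal sketch of what such an annotation might look like, with the value/index/ignore attributes inferred from its usage above:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical definition of the custom @ExportHeader annotation seen on StuInfo;
// the real project's version may differ.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface ExportHeader {
    String value();                 // header text shown in the exported Excel

    int index() default 0;          // column order in the export

    boolean ignore() default false; // skip this field on export
}
```

An export utility would read these attributes via reflection to decide each column's title, position, and whether to emit it at all.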

ReadListener implementation (batch processing)

@Slf4j
public class UploadDataListener<T> implements ReadListener<T> {
    private static final int BATCH_COUNT = 100;
    private List<T> cachedDataList = ListUtils.newArrayListWithExpectedSize(BATCH_COUNT);
    private Predicate<T> predicate;
    private Consumer<List<T>> consumer;

    public UploadDataListener(Predicate<T> predicate, Consumer<List<T>> consumer) {
        this.predicate = predicate;
        this.consumer = consumer;
    }

    public UploadDataListener(Consumer<List<T>> consumer) {
        this.consumer = consumer;
    }

    @Override
    public void invoke(T data, AnalysisContext context) {
        if (predicate != null && !predicate.test(data)) {
            return;
        }
        cachedDataList.add(data);
        if (cachedDataList.size() >= BATCH_COUNT) {
            try {
                consumer.accept(cachedDataList);
            } catch (Exception e) {
                log.error("Failed to upload data! data={}", cachedDataList);
                throw new BizException("Import failed");
            }
            cachedDataList = ListUtils.newArrayListWithExpectedSize(BATCH_COUNT);
        }
    }

    @Override
    public void doAfterAllAnalysed(AnalysisContext context) {
        if (CollUtil.isNotEmpty(cachedDataList)) {
            try {
                consumer.accept(cachedDataList);
                log.info("All data parsed!");
            } catch (Exception e) {
                log.error("Failed to upload data! data={}", cachedDataList);
                if (e instanceof BizException) {
                    throw e;
                }
                throw new BizException("Import failed");
            }
        }
    }
}

Controller endpoint using the listener

@ApiOperation("A single ReadListener solves everything")
@PostMapping("/update")
@ResponseBody
public R<String> aListener4AllExcel(MultipartFile file) {
    try {
        EasyExcel.read(file.getInputStream(), StuInfo.class,
                new UploadDataListener<StuInfo>(list -> {
                    ValidationUtils.validate(list);
                    service.saveBatch(list);
                    log.info("Imported {} rows from Excel", list.size());
                }))
                .sheet()
                .doRead();
    } catch (IOException e) {
        log.error("Import failed", e);
        throw new BizException("Import failed");
    }
    return R.success("SUCCESS");
}

For scenarios where the target class is unknown, a Map<Integer, String> based listener can be used. It reads each row into a map, batches rows, and persists them via a custom consumer. Header rows can be captured with invokeHead to align Excel columns with database fields, ensuring correct ordering even when the template differs from the source.
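The caching logic of such a Map-based listener can be illustrated with a small standalone batcher, independent of EasyExcel so it can be exercised on its own (RowBatcher is an illustrative name; in the real listener, add() corresponds to invoke() and finish() to doAfterAllAnalysed()):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Illustrative batcher mirroring the Map-based listener's caching behavior:
// rows accumulate until the batch size is hit, then the flush action runs.
class RowBatcher {
    private final int batchSize;
    private final Consumer<List<Map<Integer, String>>> flushAction;
    private List<Map<Integer, String>> cache;

    RowBatcher(int batchSize, Consumer<List<Map<Integer, String>>> flushAction) {
        this.batchSize = batchSize;
        this.flushAction = flushAction;
        this.cache = new ArrayList<>(batchSize);
    }

    // Called once per parsed row (what the listener's invoke() would do).
    void add(Map<Integer, String> row) {
        cache.add(row);
        if (cache.size() >= batchSize) {
            flush();
        }
    }

    // Called when parsing finishes (doAfterAllAnalysed): flush any remainder.
    void finish() {
        if (!cache.isEmpty()) {
            flush();
        }
    }

    private void flush() {
        flushAction.accept(cache);
        cache = new ArrayList<>(batchSize); // drop the reference so the old batch can be GC'd
    }
}
```

Because only one batch is ever held in memory at a time, the heap footprint stays bounded regardless of file size.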

Header handling example

@Override
public void invokeHead(Map<Integer, ReadCellData<?>> headMap, AnalysisContext context) {
    Map<Integer, String> columnMap = ConverterUtils.convertToStringMap(headMap, context);
    Map<String, String> columnList = new HashMap<>(); // fetch real DB columns here
    columnMap.forEach((key, value) -> {
        // record column indexes that also exist in the database
        // (filterList is a field of the listener, declared elsewhere)
        if (columnList.containsKey(value)) {
            filterList.add(key);
        }
    });
    log.info("Parsed a header row: {}", JSON.toJSONString(columnMap));
}

When processing extremely large files, the listener can be executed inside a thread pool to fully utilize CPU cores. The same pattern applies to export operations: a generic commonExport method writes any object list to an Excel file, while exportFreely accepts dynamic headers and data matrices for fully custom reports.
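One way to sketch the thread-pool pattern: the batch consumer hands each full batch to a fixed pool so database writes overlap with Excel parsing. ParallelBatchSaver is an illustrative name, and the save action is injected rather than calling a real service; note the defensive copy, since the listener reuses its cache list:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

// Illustrative sketch: submit each full batch to a fixed thread pool sized to
// the CPU count, then wait for all in-flight batches before reporting success.
class ParallelBatchSaver<T> {
    private final ExecutorService pool =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
    private final List<Future<?>> pending = new ArrayList<>();
    private final Consumer<List<T>> saveAction; // e.g. list -> service.saveBatch(list)

    ParallelBatchSaver(Consumer<List<T>> saveAction) {
        this.saveAction = saveAction;
    }

    // Copy the batch first: the listener clears/replaces its cache list after flushing.
    void submit(List<T> batch) {
        List<T> copy = new ArrayList<>(batch);
        pending.add(pool.submit(() -> saveAction.accept(copy)));
    }

    // Block until every submitted batch has been persisted; Future.get() also
    // rethrows any save failure so the import can be reported as failed.
    void awaitCompletion() throws InterruptedException, ExecutionException {
        for (Future<?> f : pending) {
            f.get();
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```

The controller would construct the listener with `saver::submit` as its consumer and call `awaitCompletion()` after `doRead()` returns.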

Generic export method

public <T> void commonExport(String fileName, List<T> data, Class<T> clazz, HttpServletResponse response) throws IOException {
    if (CollectionUtil.isEmpty(data)) {
        data = new ArrayList<>();
    }
    fileName = URLEncoder.encode(fileName, "UTF-8");
    response.setContentType("application/vnd.ms-excel");
    response.setCharacterEncoding("utf-8");
    response.setHeader("Content-disposition", "attachment;filename=" + fileName + ".xlsx");
    EasyExcel.write(response.getOutputStream()).head(clazz).sheet("sheet1").doWrite(data);
}
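For the exportFreely case, EasyExcel's writer accepts a dynamic header as a List<List<String>>, where each inner list holds one column's (possibly multi-row) header. A small helper can build that shape from a flat list of column names; DynamicHeadBuilder is an illustrative name:

```java
import java.util.ArrayList;
import java.util.List;

// Converts a flat list of column names into EasyExcel's dynamic-head shape:
// one inner list per column, each holding that column's header row(s).
class DynamicHeadBuilder {
    static List<List<String>> singleRowHead(List<String> columnNames) {
        List<List<String>> head = new ArrayList<>();
        for (String name : columnNames) {
            List<String> column = new ArrayList<>();
            column.add(name); // a one-entry list produces a single header row
            head.add(column);
        }
        return head;
    }
}
```

An exportFreely-style method would then pass the result via `EasyExcel.write(out).head(head).sheet("sheet1").doWrite(rows)`, where rows is a List<List<Object>> data matrix instead of an entity list.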

These techniques together enable safe, high‑performance Excel import/export for Java backend services handling millions of rows.

Tags: Java, Data Processing, EasyExcel, Excel import, thread pool
Written by Java Tech Enthusiast

Sharing computer programming language knowledge, focusing on Java fundamentals, data structures, related tools, Spring Cloud, IntelliJ IDEA... Book giveaways, red‑packet rewards and other perks await!