
Spring Batch Tutorial: Introduction, Architecture, Core Interfaces, and Practical Implementation

This article provides a comprehensive overview of Spring Batch, covering its purpose, typical business scenarios, core architecture and interfaces, and detailed step‑by‑step code examples for configuring jobs, steps, flows, parallel execution, decision making, nested jobs, data reading and writing, item processing, and job scheduling within a Spring Boot application.


1. Introduction

Spring Batch is a lightweight, comprehensive batch-processing framework built on Spring and designed for robust enterprise batch jobs. It is not a scheduler itself; instead, it integrates with external schedulers such as Quartz.

2. Business Scenarios

Periodic batch submission

Concurrent batch processing

Message‑driven staged processing

Massive parallel batch execution

Manual or scheduled restart after failure

Ordered step execution (workflow‑driven)

Skipping records (e.g., on rollback)

Whole‑transaction batch for small batches or existing scripts

3. Core Concepts

3.1 Architecture

The main components are JobRepository (persists job and step execution metadata), JobLauncher (starts a Job with JobParameters), Job (encapsulates the batch process), and Step (a distinct phase of a job).

3.2 Core Interfaces

ItemReader: reads input items one at a time; the framework groups them into chunks for a step.

ItemProcessor : processes each item.

ItemWriter : writes the processed items.

The typical flow is ItemReader → ItemProcessor → ItemWriter within a Step; a Job may contain multiple Steps.
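As a sketch of how these interfaces look in practice, here is a hypothetical processor (not from the original tutorial) implementing the ItemProcessor contract:

```java
import org.springframework.batch.item.ItemProcessor;

// Hypothetical processor: trims incoming strings and filters out blank
// lines by returning null, which tells Spring Batch to drop the item
// before it reaches the ItemWriter.
public class TrimProcessor implements ItemProcessor<String, String> {
    @Override
    public String process(String item) {
        String trimmed = item.trim();
        return trimmed.isEmpty() ? null : trimmed; // null = filter the item out
    }
}
```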

4. Practical Implementation

4.0 Adding Spring Batch

<parent>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-parent</artifactId>
  <version>2.2.5.RELEASE</version>
</parent>
<dependencies>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-batch</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-validation</artifactId>
  </dependency>
  <dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jdbc</artifactId>
  </dependency>
</dependencies>

4.1 Job Configuration

Enable batch processing with @EnableBatchProcessing and define a starter class:

@SpringBootApplication
@EnableBatchProcessing
public class SpringBatchStartApplication {
    public static void main(String[] args) {
        SpringApplication.run(SpringBatchStartApplication.class, args);
    }
}
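Spring Batch stores job metadata in a relational database, so the application also needs datasource settings. A minimal application.properties sketch, assuming a local MySQL schema named spring_batch and placeholder credentials:

```properties
spring.datasource.url=jdbc:mysql://localhost:3306/spring_batch?serverTimezone=UTC
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
# Let Spring Boot create the BATCH_* metadata tables on startup
spring.batch.initialize-schema=always
```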

Example of a simple job with a single step:

@Component
public class FirstJobDemo {
    @Autowired private JobBuilderFactory jobBuilderFactory;
    @Autowired private StepBuilderFactory stepBuilderFactory;
    @Bean
    public Job firstJob() {
        return jobBuilderFactory.get("firstJob")
            .start(step())
            .build();
    }
    private Step step() {
        return stepBuilderFactory.get("step")
            .tasklet((contribution, chunkContext) -> {
                System.out.println("Executing step....");
                return RepeatStatus.FINISHED;
            }).build();
    }
}

4.2 Flow Control

Multi‑step job with conditional flow:

@Bean
public Job multiStepJob() {
    return jobBuilderFactory.get("multiStepJob2")
        .start(step1())
        .on(ExitStatus.COMPLETED.getExitCode()).to(step2())
        .from(step2()).on(ExitStatus.COMPLETED.getExitCode()).to(step3())
        .from(step3()).end()
        .build();
}

Parallel execution using split and SimpleAsyncTaskExecutor:

@Bean
public Job splitJob() {
    return jobBuilderFactory.get("splitJob")
        .start(flow1())
        .split(new SimpleAsyncTaskExecutor())
        .add(flow2())
        .end()
        .build();
}

4.3 Decision Making

Custom JobExecutionDecider to choose flow based on weekday/weekend:

@Component
public class MyDecider implements JobExecutionDecider {
    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        DayOfWeek day = LocalDate.now().getDayOfWeek();
        return (day == DayOfWeek.SATURDAY || day == DayOfWeek.SUNDAY)
            ? new FlowExecutionStatus("weekend")
            : new FlowExecutionStatus("workingDay");
    }
}
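The decider is not wired into a job in the snippet above; a sketch of how it could be used, assuming step beans startStep(), weekendStep(), and workingDayStep():

```java
@Bean
public Job deciderJob(MyDecider myDecider) {
    return jobBuilderFactory.get("deciderJob")
        .start(startStep())
        .next(myDecider) // evaluate the decider after the first step
        .from(myDecider).on("weekend").to(weekendStep())
        .from(myDecider).on("workingDay").to(workingDayStep())
        .end()
        .build();
}
```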

4.4 Nested Jobs

Convert a child job into a Step and embed it in a parent job:

private Step childJobOneStep() {
    return new JobStepBuilder(new StepBuilder("childJobOneStep"))
        .job(childJobOne())
        .launcher(jobLauncher)
        .repository(jobRepository)
        .transactionManager(platformTransactionManager)
        .build();
}
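The resulting step can then be chained into a parent job like any other step; a sketch assuming a second childJobTwoStep() built the same way:

```java
@Bean
public Job parentJob() {
    return jobBuilderFactory.get("parentJob")
        .start(childJobOneStep())  // runs childJobOne as a step
        .next(childJobTwoStep())   // hypothetical second nested job step
        .build();
}
```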

4.5 Reading Data

File reader example using FlatFileItemReader and a line mapper to map CSV rows to a POJO TestData:

@Data
public class TestData {
    private int id;
    private String field1;
    private String field2;
    private String field3;
}
FlatFileItemReader<TestData> reader = new FlatFileItemReader<>();
reader.setResource(new ClassPathResource("reader/file"));
reader.setLinesToSkip(1);
DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
tokenizer.setNames("id", "field1", "field2", "field3");
DefaultLineMapper<TestData> mapper = new DefaultLineMapper<>();
mapper.setLineTokenizer(tokenizer);
mapper.setFieldSetMapper(fieldSet -> {
    TestData data = new TestData();
    data.setId(fieldSet.readInt("id"));
    data.setField1(fieldSet.readString("field1"));
    data.setField2(fieldSet.readString("field2"));
    data.setField3(fieldSet.readString("field3"));
    return data;
});
reader.setLineMapper(mapper);
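To actually use the reader, it is plugged into a chunk-oriented step. A minimal sketch: fileItemReader() is an assumed bean wrapping the reader configured above, the writer simply prints items, and the chunk size of 10 is arbitrary:

```java
@Bean
public Job fileReadJob() {
    return jobBuilderFactory.get("fileReadJob")
        .start(stepBuilderFactory.get("fileReadStep")
            // read and write TestData in chunks of 10 items per transaction
            .<TestData, TestData>chunk(10)
            .reader(fileItemReader())
            .writer(items -> items.forEach(System.out::println))
            .build())
        .build();
}
```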

4.6 Writing Data

File writer that converts TestData objects to JSON strings using Jackson:

FlatFileItemWriter<TestData> writer = new FlatFileItemWriter<>();
FileSystemResource file = new FileSystemResource("D:/code/spring-batch-demo/src/main/resources/writer/writer-file");
writer.setResource(file);
writer.setLineAggregator(item -> {
    try {
        return new ObjectMapper().writeValueAsString(item);
    } catch (JsonProcessingException e) {
        e.printStackTrace();
        return "";
    }
});
writer.afterPropertiesSet();

4.7 Item Processing

Using BeanValidatingItemProcessor to validate POJOs during processing:

BeanValidatingItemProcessor<TestData> processor = new BeanValidatingItemProcessor<>();
processor.afterPropertiesSet();
// processor.setFilter(true); // optional filtering of invalid items
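BeanValidatingItemProcessor validates against JSR-303 annotations on the POJO, which is why spring-boot-starter-validation is on the classpath. A sketch of TestData with illustrative constraints (the specific annotations here are assumptions, not from the original):

```java
import javax.validation.constraints.NotBlank;
import javax.validation.constraints.Size;
import lombok.Data;

@Data
public class TestData {
    private int id;

    // Invalid items fail the step, or are silently dropped
    // if processor.setFilter(true) is enabled.
    @NotBlank(message = "field1 must not be blank")
    private String field1;

    @Size(max = 20, message = "field2 is limited to 20 characters")
    private String field2;

    private String field3;
}
```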

4.8 Job Scheduling

Expose a REST endpoint to launch a job with a message parameter:

@RestController
@RequestMapping("job")
public class JobController {
    @Autowired private Job job;
    @Autowired private JobLauncher jobLauncher;
    @GetMapping("launcher/{message}")
    public String launcher(@PathVariable String message) throws Exception {
        JobParameters params = new JobParametersBuilder()
            .addString("message", message)
            .toJobParameters();
        jobLauncher.run(job, params);
        return "success";
    }
}

The tutorial also includes notes on integrating with external schedulers such as Quartz or XXL‑Job.
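Besides the REST trigger, a job can be launched on a fixed schedule with Spring's own @Scheduled (which requires @EnableScheduling on a configuration class). A sketch in which the cron expression and parameter name are illustrative:

```java
@Component
public class JobScheduler {

    @Autowired private Job job;
    @Autowired private JobLauncher jobLauncher;

    @Scheduled(cron = "0 0 2 * * ?") // every day at 02:00
    public void runJob() throws Exception {
        JobParameters params = new JobParametersBuilder()
            // a unique parameter so a new job instance is created each run
            .addLong("timestamp", System.currentTimeMillis())
            .toJobParameters();
        jobLauncher.run(job, params);
    }
}
```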

Written by IT Architects Alliance

Discussion and exchange on system, internet, large‑scale distributed, high‑availability, and high‑performance architectures, as well as big data, machine learning, AI, and architecture adjustments with internet technologies. Includes real‑world large‑scale architecture case studies. Open to architects who have ideas and enjoy sharing.
