How to Insert 300,000 Records Efficiently with MyBatis and JDBC
This article demonstrates how to insert 300,000 rows into a MySQL table using MyBatis and JDBC, compares naïve single‑batch and per‑row approaches, presents optimized batch‑processing code with configurable batch sizes and wait times, and shares practical performance‑tuning tips such as index handling and connection‑pool configuration.
Overview
The article explains how to insert a large volume of data (300,000 rows) into a MySQL t_user table using MyBatis and JDBC, evaluates different insertion strategies, and provides performance‑optimisation recommendations.
Table Structure
CREATE TABLE `t_user` (
`id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'user id',
`username` varchar(64) DEFAULT NULL COMMENT 'user name',
`age` int(4) DEFAULT NULL COMMENT 'age',
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='user information table';
Entity, Mapper and Configuration
User Entity
/**
* <p>User entity</p>
* @author zjq
*/
@Data
public class User {
private int id;
private String username;
private int age;
}
Mapper Interface
public interface UserMapper {
/**
* Batch insert users
* @param userList users to insert
*/
void batchInsertUser(@Param("list") List<User> userList);
}
Mapper XML
<!-- Batch insert users -->
<insert id="batchInsertUser" parameterType="java.util.List">
insert into t_user(username,age) values
<foreach collection="list" item="item" separator=",">
(#{item.username}, #{item.age})
</foreach>
</insert>
jdbc.properties
jdbc.driver=com.mysql.jdbc.Driver
jdbc.url=jdbc:mysql://localhost:3306/test
jdbc.username=root
jdbc.password=root
sqlMapConfig.xml (excerpt)
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE configuration PUBLIC "-//mybatis.org//DTD Config 3.0//EN" "http://mybatis.org/dtd/mybatis-3-config.dtd">
<configuration>
<properties resource="jdbc.properties"/>
<typeAliases>
<typeAlias type="com.zjq.domain.User" alias="user"/>
</typeAliases>
<environments default="development">
<environment id="development">
<transactionManager type="JDBC"/>
<dataSource type="POOLED">
<property name="driver" value="${jdbc.driver}"/>
<property name="url" value="${jdbc.url}"/>
<property name="username" value="${jdbc.username}"/>
<property name="password" value="${jdbc.password}"/>
</dataSource>
</environment>
</environments>
<mappers>
<mapper resource="com/zjq/mapper/UserMapper.xml"/>
</mappers>
</configuration>
Naïve Single‑Batch Attempt
Attempting to insert all 300,000 rows in a single statement fails with MySQL's PacketTooBigException, because the generated packet exceeds the server's max_allowed_packet limit.
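If the single-statement route must be made to work, the server's packet limit can be inspected and raised. The SQL below is a minimal sketch; the 64 MB value is illustrative, not a recommendation from the original article:

```sql
-- Check the current limit (in bytes)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for the running server (illustrative value: 64 MB).
-- Requires sufficient privileges; the change is lost on restart
-- unless max_allowed_packet is also set in my.cnf.
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;
```

Even with a higher limit, splitting the load into batches remains the safer approach, as shown in the rest of the article.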
Per‑Row Insertion (Very Slow)
Inserting rows one at a time through MyBatis works, but takes over 4 hours for 300,000 rows, which is far too slow for practical use.
Optimised Batch Insertion with MyBatis
With a batch size of 1,000 rows, the total time drops to about 24 seconds; increasing the batch size to 5,000 rows reduces it further to roughly 13 seconds. A short pause between batches can optionally be added if memory pressure is a concern, though it is not included in these timings.
@Test
public void testBatchInsertUser() throws IOException {
InputStream is = Resources.getResourceAsStream("sqlMapConfig.xml");
SqlSessionFactory factory = new SqlSessionFactoryBuilder().build(is);
SqlSession session = factory.openSession();
System.out.println("===== Starting data insert =====");
long start = System.currentTimeMillis();
List<User> list = new ArrayList<>();
for (int i = 1; i <= 300000; i++) {
User u = new User();
u.setId(i);
u.setUsername("共饮一杯无 " + i);
u.setAge((int)(Math.random()*100));
list.add(u);
if (i % 1000 == 0) {
session.insert("batchInsertUser", list);
session.commit();
list.clear();
// Thread.sleep(10 * 1000); // optional pause between batches to ease memory pressure (not counted in the quoted timings)
}
}
if (!list.isEmpty()) {
session.insert("batchInsertUser", list);
session.commit();
}
long spent = System.currentTimeMillis() - start;
System.out.println("Successfully inserted 300,000 rows in " + spent + " ms");
session.close();
}
JDBC Batch Insertion
The same data can be inserted using plain JDBC batch processing. The code below disables auto‑commit, adds 1,000‑row batches to a PreparedStatement, and commits after each batch.
@Test
public void testJDBCBatchInsertUser() throws SQLException {
Connection conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "root", "root");
conn.setAutoCommit(false);
String sql = "INSERT INTO t_user (username, age) VALUES (?, ?)";
PreparedStatement ps = conn.prepareStatement(sql);
Random rand = new Random();
long start = System.currentTimeMillis();
for (int i = 1; i <= 300000; i++) {
ps.setString(1, "共饮一杯无 " + i);
ps.setInt(2, rand.nextInt(100));
ps.addBatch();
if (i % 1000 == 0) {
ps.executeBatch();
conn.commit();
System.out.println("Inserted " + i + " rows so far");
}
}
ps.executeBatch();
conn.commit();
long spent = System.currentTimeMillis() - start;
System.out.println("Successfully inserted 300,000 rows in " + spent + " ms");
ps.close();
conn.close();
}
Performance Optimisation Tips
Use batch processing (e.g., 1,000–5,000 rows per batch) to reduce round‑trips and network overhead.
Temporarily drop indexes before bulk loading and recreate them afterwards.
Employ a connection pool to avoid the cost of repeatedly opening/closing connections.
Adjust MySQL parameters such as max_allowed_packet and buffer sizes for large payloads.
Introduce a modest wait time between batches if memory pressure is a concern.
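One MySQL-specific detail worth adding to the tips above: MySQL Connector/J only rewrites a JDBC batch into multi-row INSERT statements when rewriteBatchedStatements is enabled; without it, executeBatch() still sends one statement per row. A jdbc.properties variant enabling it might look like this (the flag is standard Connector/J behaviour; the host, credentials, and database mirror the earlier configuration):

```properties
jdbc.driver=com.mysql.jdbc.Driver
jdbc.url=jdbc:mysql://localhost:3306/test?rewriteBatchedStatements=true&useUnicode=true&characterEncoding=utf8
jdbc.username=root
jdbc.password=root
```

With this flag set, the plain-JDBC batch code shown earlier typically gains a substantial speedup, since 1,000 buffered rows travel as a single multi-row INSERT instead of 1,000 separate statements.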