
Fast Migration of Millions of MySQL Rows: Step-by-Step Guide

This article outlines a practical method for migrating MySQL tables with millions of rows by exporting data to an outfile, compressing, transferring, and loading it on the target server, while also addressing strict SQL mode errors that can arise during import.

ZhiKe AI

Using mysqldump works for small datasets, but handling tables with millions of rows requires a different approach. The following procedure demonstrates how to migrate a large MySQL table efficiently.

Modify my.cnf on the source server to disable the secure_file_priv restriction (an empty value lets SELECT ... INTO OUTFILE write to any path; a server restart is required for the change to take effect):

[mysqld]
secure_file_priv=''
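Because secure_file_priv is read-only at runtime, the new value only appears after the restart. You can confirm it from any client session:

```sql
-- An empty value means exports may be written to any path;
-- NULL means import/export file operations are disabled entirely.
SHOW VARIABLES LIKE 'secure_file_priv';
```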

Export the data to a file on the source server:

SELECT * FROM t_message_firebase INTO OUTFILE '/data/t_message_firebase.dat';
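By default, INTO OUTFILE writes tab-separated columns and newline-terminated rows. If column values may themselves contain tabs or newlines, spelling the format out makes the export unambiguous. A sketch (the delimiter choices here are illustrative, not from the original article):

```sql
-- Explicit format clauses; whatever is used here must be repeated
-- verbatim in the LOAD DATA statement on the destination server.
SELECT * FROM t_message_firebase
INTO OUTFILE '/data/t_message_firebase.dat'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\n';
```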

Compress the exported file (-j stores only the file name in the archive, not its /data directory prefix, so it unpacks cleanly anywhere):

zip -j t_message_firebase.dat.zip /data/t_message_firebase.dat

Transfer the compressed file to the destination server (e.g., using scp or another file‑transfer tool).

Decompress the file on the destination server into the directory the import statement will reference (here /data):

unzip t_message_firebase.dat.zip -d /data

Import the data into the target table:

LOAD DATA INFILE '/data/t_message_firebase.dat' INTO TABLE t_message_firebase;
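If the export specified explicit FIELDS and LINES clauses, the import must repeat them exactly, or rows will be split on the wrong delimiters. A sketch mirroring a comma-delimited export (delimiters are illustrative):

```sql
-- These clauses must match the ones used in SELECT ... INTO OUTFILE.
LOAD DATA INFILE '/data/t_message_firebase.dat'
INTO TABLE t_message_firebase
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\n';
```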

If the import fails with "ERROR 1261 (01000): Row 1 doesn't contain data for all columns", the server is running with STRICT_TRANS_TABLES in its sql_mode: in strict mode, a row supplying fewer values than the table has columns is rejected as an error instead of being loaded with a warning. To work around this, clear the strict setting for the session:

-- Check current sql_mode
SHOW VARIABLES LIKE 'sql_mode';

-- Temporarily disable strict mode for the session
SET sql_mode='';
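Putting the fix together in one session: SET sql_mode is session-scoped by default, so other connections keep their strict settings, and any mismatches are downgraded to warnings you can inspect after the load:

```sql
-- Affects only the current connection.
SET SESSION sql_mode = '';

LOAD DATA INFILE '/data/t_message_firebase.dat' INTO TABLE t_message_firebase;

-- Review rows that were loaded with missing or truncated columns.
SHOW WARNINGS LIMIT 10;
```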

References:

https://www.cnblogs.com/zst062102/p/13129178.html
https://my.oschina.net/u/1018607/blog/857728

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Data Migration, MySQL, Large Tables, SQL Mode, LOAD DATA
Written by

ZhiKe AI

We dissect AI-era technologies, tools, and trends with a hardcore perspective. Focused on large models, agents, MCP, function calling, and hands‑on AI development. No fluff, no hype—only actionable insights, source code, and practical ideas. Get a daily dose of intelligence to simplify tech and make efficiency tangible.
