
Using MySQL Shell Util import_table and import_json for Fast Data Import

This article demonstrates how to use MySQL Shell's util.import_table and util.import_json functions to import CSV and JSON data efficiently, compares their performance with the traditional mysqlimport tool, explains key options such as file path, schema, table, dialect, and thread count, and shows example sessions in MySQL Shell's Python mode.

Aikesheng Open Source Community

Background

MySQL Shell 8.0 provides two util utilities, import_table (importTable in JavaScript mode) and import_json (importJson), which communicate over the classic MySQL protocol and the X Plugin protocol respectively. The traditional LOAD DATA INFILE statement and the mysqlimport tool import data single-threaded, taking about 24 seconds for 1,000,000 rows in this test.
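For reference, the single-threaded baseline above corresponds to a statement like the following (table and file path taken from the test below; the exact FIELDS/LINES clauses depend on your file format):

```sql
-- Classic single-threaded load: one connection, one thread.
LOAD DATA INFILE '/var/lib/mysql-files/tl1.csv'
INTO TABLE ytt.tl1
FIELDS TERMINATED BY ',';
```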

Analysis

The Shell util tools require local_infile to be enabled on the server. import_table accepts a file path and an options dictionary (schema, table, field delimiter, progress display, thread count, and so on).
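The local_infile setting can be checked and enabled from any SQL session before running the import:

```sql
-- Verify and enable the server-side local_infile setting
-- (required before util.import_table can load local files).
SHOW GLOBAL VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = ON;
```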

Importing CSV with import_table

Example session (Python mode):

MySQL 127.0.0.1:3306 ssl Py> \c [email protected]:3306
Creating a session to '[email protected]:3306'
MySQL 127.0.0.1:3306 ssl Py> \use ytt
Default schema set to `ytt`.
MySQL 127.0.0.1:3306 ssl Py> y_file1='/var/lib/mysql-files/tl1.csv'
MySQL 127.0.0.1:3306 ssl Py> y_options1={"schema":"ytt","table":"tl1","fieldsTerminatedBy":",","showProgress":True,"threads":4}
MySQL 127.0.0.1:3306 ssl Py> util.import_table(y_file1,y_options1);
Importing from file '/var/lib/mysql-files/tl1.csv' to table `ytt`.`tl1` ...
File '/var/lib/mysql-files/tl1.csv' (40.87 MB) was imported in 16.74 sec at 2.44 MB/s

The import completed in under 17 seconds, noticeably faster than mysqlimport.

Using the dialect option

The dialect option can simplify the call by pre-defining delimiters. Setting "dialect": "csv-unix" yields similar performance with fewer parameters.
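A minimal sketch of how a dialect preset replaces the per-delimiter settings used earlier; the helper function name here is illustrative, while the option keys ("schema", "table", "dialect", "showProgress", "threads") are the ones used in the sessions above:

```python
# Hypothetical helper: build a util.import_table options dict using a
# dialect preset instead of spelling out fieldsTerminatedBy and friends.
def make_import_options(schema, table, threads=4, dialect="csv-unix"):
    """Return an options dict for util.import_table with a dialect preset."""
    return {
        "schema": schema,
        "table": table,
        "dialect": dialect,       # presets field/line terminators
        "showProgress": True,
        "threads": threads,
    }

# In a MySQL Shell Python session this would be passed straight through:
# util.import_table('/var/lib/mysql-files/tl1.csv', make_import_options("ytt", "tl1"))
```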

Importing JSON

Create a JSON table:

MySQL 127.0.0.1:3306 ssl Py> \sql create table tl1_json(
    id bigint primary key,
    content json);
Query OK, 0 rows affected

Define file and options:

MySQL 127.0.0.1:3306 ssl Py> y_file2='/var/lib/mysql-files/tl1.json'
MySQL 127.0.0.1:3306 ssl Py> rows=['content']
MySQL 127.0.0.1:3306 ssl Py> y_options2={"schema":"ytt","table":"tl1_json","dialect":"json","showProgress":True,"threads":4,'columns':rows}
MySQL 127.0.0.1:3306 ssl Py> util.import_table(y_file2,y_options2);
File '/var/lib/mysql-files/tl1.json' (90.15 MB) was imported in 23.35 sec at 3.86 MB/s

Importing JSON via the X Plugin protocol can also be done with util.import_json after switching to the X port (33060):

MySQL 127.0.0.1:33060+ ssl Py> \c [email protected]:33060
Creating a session to '[email protected]:33060'
MySQL 127.0.0.1:33060+ ssl Py> \use ytt
MySQL 127.0.0.1:33060+ ssl Py> y_options3={"schema":"ytt","table":"tl1_json","tableColumn":"content"}
MySQL 127.0.0.1:33060+ ssl Py> util.import_json(y_file2,y_options3);
Processed 90.15 MB in 1 048 576 documents in 35.24 sec

While import_json runs single‑threaded and is slower (≈35 seconds), it supports seamless conversion between BSON and JSON.
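To make use of that conversion, import_json accepts a convertBsonTypes option for translating MongoDB extended-JSON (BSON) types into native values; the exact option combination below is a sketch, not a copy of the session above:

```python
# Sketch of an import_json options dict with BSON conversion enabled.
# "convertBsonTypes" is an import_json option; treat this particular
# combination of values as illustrative.
y_options_bson = {
    "schema": "ytt",
    "table": "tl1_json",
    "tableColumn": "content",
    "convertBsonTypes": True,  # e.g. translate {"$date": ...} wrappers
}

# In a MySQL Shell Python session connected over the X protocol (33060):
# util.import_json('/var/lib/mysql-files/tl1.json', y_options_bson)
```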

Conclusion

MySQL Shell's util functions provide a fast, flexible way to bulk-load CSV and JSON data, often outperforming the classic mysqlimport tool, especially when multi-threading and appropriate dialect options are used.

Written by

Aikesheng Open Source Community

The Aikesheng Open Source Community provides stable, enterprise‑grade MySQL open‑source tools and services, releases a premium open‑source component each year (1024), and continuously operates and maintains them.
