Tag: data-import


Code Ape Tech Column
May 9, 2025 · Databases

Efficient Strategies for Importing One Billion Records into MySQL

This article explains how to import 1 billion 1 KB log records stored in HDFS or S3 into MySQL by analyzing single‑table limits, using batch inserts, choosing storage engines, sharding, optimizing file‑reading methods, and coordinating distributed tasks with Redis, Redisson, and Zookeeper to ensure ordered, reliable, and high‑throughput data loading.
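
The batch-insert approach these articles describe can be sketched in Python; the table name `log_2025`, the two-column schema, and the batch size are illustrative assumptions, not details from the article:

```python
from itertools import islice

def batches(records, size):
    """Yield successive lists of at most `size` records."""
    it = iter(records)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def build_insert(table, rows):
    """Build one multi-row INSERT, the core of batch loading:
    one statement per batch instead of one per record."""
    placeholders = ", ".join(["(%s, %s)"] * len(rows))
    params = [v for row in rows for v in row]
    return f"INSERT INTO {table} (id, payload) VALUES {placeholders}", params

# Example: 10 records in batches of 4 -> 3 statements (4 + 4 + 2 rows).
records = [(i, f"log-{i}") for i in range(10)]
stmts = [build_insert("log_2025", b) for b in batches(records, 4)]
```

With a real driver, each (sql, params) pair would go through one `cursor.execute` call, cutting round trips from one per record to one per batch.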

Kafka · MySQL · Performance Tuning
0 likes · 19 min read
Architect's Guide
Jan 3, 2025 · Big Data

Efficient Import and Export of Millions of Records Using POI and EasyExcel in Java

This article explains how to handle massive Excel‑DB import/export tasks in Java by comparing POI workbook types, selecting the right implementation, and leveraging EasyExcel with batch queries, sheet splitting, and JDBC batch inserts to process over three million rows efficiently.

Big Data · EasyExcel · Excel
0 likes · 24 min read
Test Development Learning Exchange
Dec 18, 2024 · Fundamentals

How to Import and Export Data in Pandas

This guide explains how to use Pandas to import data from various file formats such as CSV, Excel, JSON, SQL, HTML, HDF5, Pickle, TSV, fixed‑width files and the clipboard, and also demonstrates how to export DataFrames to formats like CSV, Excel, JSON, SQL, HTML, HDF5 and Pickle, providing clear code examples for each operation.
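
As a minimal taste of the import/export pairs the guide walks through, here is a CSV and JSON round trip with pandas (in-memory buffers stand in for files on disk):

```python
import io
import pandas as pd

# A small DataFrame to round-trip through two of the covered formats.
df = pd.DataFrame({"name": ["Ada", "Grace"], "score": [95, 88]})

csv_buf = io.StringIO()
df.to_csv(csv_buf, index=False)           # export to CSV
csv_buf.seek(0)
restored = pd.read_csv(csv_buf)           # import it back

json_text = df.to_json(orient="records")  # export as a JSON array of records
```

The same pattern (`read_*` paired with `to_*`) applies across the other formats the guide covers, such as Excel, SQL, and HDF5.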

Python · data analysis · data-export
0 likes · 10 min read
Baidu Tech Salon
Oct 22, 2024 · Big Data

TDE-ClickHouse: Baidu MEG's High-Performance Big Data Analytics Engine

TDE‑ClickHouse, the core engine of Baidu MEG's Turing 3.0 ecosystem, delivers sub‑second, self‑service analytics on petabyte‑scale data by decoupling compute, adding multi‑level aggregation, high‑cardinality and rule‑based optimizations, a two‑phase bulk‑load pipeline, cloud‑native deployment, and a lightweight meta service. It now powers over 350,000 cores and 10 PB of storage, serving more than 150,000 daily BI queries with average response times under three seconds.

ClickHouse · Cloud Native · Database Architecture
0 likes · 19 min read
php中文网 Courses
Jul 19, 2024 · Backend Development

10 Innovative PHP Project Ideas to Elevate Your Portfolio

This article presents ten carefully crafted PHP project concepts—from building a custom MVC framework and implementing data import/export to creating QR code generators and event management tools—each designed to showcase backend expertise and enhance a developer's professional portfolio.

CRUD · Email · MVC
0 likes · 7 min read
Code Ape Tech Column
Jun 14, 2024 · Databases

Designing an Efficient Import of 1 Billion Records into MySQL: Architecture, Batch Loading, Sharding, and Concurrency Control

This article analyzes how to import one billion 1 KB log records stored in HDFS or S3 into MySQL by evaluating single‑table limits, choosing storage engines, designing sharding, batch insertion, file‑reading strategies, task coordination, and distributed locking to achieve high‑throughput and ordered writes.
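
The sharding step can be illustrated with a small routing function; hash-based routing over 100 tables is an assumption here (the article's design may split by key range or source file instead):

```python
import hashlib

TABLE_COUNT = 100  # hypothetical: one logical table split into 100 shards

def route(record_id: str) -> str:
    """Map a record key to a shard table name with a stable hash,
    so the same key always lands in the same table."""
    digest = hashlib.md5(record_id.encode()).hexdigest()
    shard = int(digest, 16) % TABLE_COUNT
    return f"log_{shard:02d}"
```

A stable hash keeps routing deterministic across workers, which matters when many concurrent tasks must agree on where each record belongs.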

MySQL · Performance · batch insert
0 likes · 19 min read
Selected Java Interview Questions
Nov 30, 2023 · Databases

Designing a High‑Performance System to Import 1 Billion Records into MySQL

To import one billion 1 KB log records stored in HDFS or S3 into MySQL as quickly as possible, the article analyzes constraints, evaluates single‑table limits, proposes sharding, batch inserts, storage‑engine choices, file‑reading strategies, task coordination, reliability mechanisms, and concurrency control.

MySQL · Performance Optimization · batch insert
0 likes · 19 min read
Architecture Digest
Nov 27, 2023 · Databases

Fast Import of 1 Billion Records into MySQL: Design, Performance, and Reliability Considerations

To import one billion 1 KB log records into MySQL efficiently, the article examines data size constraints, B‑tree index limits, batch insertion strategies, storage engine choices, file‑reading techniques, task coordination with Redis, Redisson semaphores, and distributed lock handling to ensure ordered, reliable, high‑throughput loading.

Big Data · MySQL · Performance
0 likes · 18 min read
Top Architecture Tech Stack
Nov 2, 2023 · Databases

Strategies for Efficiently Importing One Billion Records into MySQL

This article analyzes the constraints of loading one billion 1 KB log records from distributed storage into MySQL, evaluates single‑table limits, proposes batch‑insert, sharding, storage‑engine, file‑reading, and distributed‑task coordination techniques to achieve high‑speed, ordered, and reliable data ingestion.

Java · MySQL · Performance
0 likes · 18 min read
Selected Java Interview Questions
Oct 31, 2023 · Databases

Designing High‑Performance Import of 1 Billion Records into MySQL

To import one billion 1 KB unstructured log entries stored in HDFS or S3 into MySQL efficiently, the article discusses constraints, B+‑tree limits, batch insertion, storage engine choices, sharding, task coordination, concurrency control, progress tracking with Redis, and reliable distributed execution strategies.
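
The progress-tracking idea behind reliable resumption can be sketched without Redis; a plain dict stands in here for the Redis hash such a design would use (the key names are hypothetical):

```python
# Idempotent progress tracking, the idea behind resuming a failed import:
# record the highest fully loaded batch per file, skip anything at or below it.
# In production this dict would be a Redis hash, e.g.
# HSET import:progress <file_name> <batch_no>.
progress = {}

def mark_done(file_name, batch_no):
    """Record that a batch finished; never move progress backwards."""
    progress[file_name] = max(progress.get(file_name, -1), batch_no)

def should_process(file_name, batch_no):
    """A restarted worker calls this to skip already-loaded batches."""
    return batch_no > progress.get(file_name, -1)
```

Because re-checking is cheap and skipping is safe, a crashed worker can simply restart from the beginning of its file without double-inserting rows.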

MySQL · Performance Optimization · batch insert
0 likes · 19 min read
Code Ape Tech Column
Oct 18, 2023 · Databases

Efficient Strategies for Importing One Billion Records into MySQL

This article explains how to import 1 billion 1 KB log records stored in HDFS or S3 into MySQL by analyzing table capacity limits, choosing storage engines, designing batch inserts, coordinating file reading and writing, and handling task reliability with Redis, Redisson, and Zookeeper.

MySQL · Performance Optimization · Redisson
0 likes · 18 min read
Aikesheng Open Source Community
Oct 12, 2023 · Databases

How to Import Large Volumes of Data into MySQL: Tools, Commands, and Performance Tips

This article compares various MySQL data import/export methods, including mysqldump, mydumper, SELECT ... INTO OUTFILE, and the MySQL Shell utilities, explaining their commands, configuration options, performance results, and best‑practice recommendations based on testing with a 10‑million‑row table.

Export · MySQL · MySQL Shell
0 likes · 16 min read
Architecture Digest
Apr 16, 2023 · Databases

Getting Started with DataGrip: Installation, Configuration, and Common Features

This guide introduces DataGrip, a JetBrains database client, covering download, installation, theme selection, driver management, connection setup, SQL editing shortcuts, data export/import, table creation, navigation, and various productivity features for efficient database development.

DataGrip · JetBrains · SQL
0 likes · 12 min read
Aikesheng Open Source Community
Feb 20, 2023 · Databases

Using Control Files with obloader for Data Import and Export in OceanBase

This article explains how to create and use control files with the obloader tool to import and export data in OceanBase, covering template syntax, practical examples, common errors, and multiple solutions for handling column mismatches and data preprocessing.

Database · OceanBase · SQL
0 likes · 9 min read
Top Architect
Jan 31, 2023 · Backend Development

Handling Large-Scale Excel Import/Export with POI and EasyExcel in Java

This article explains how to efficiently import and export massive Excel datasets in Java by comparing POI implementations, selecting the appropriate workbook type, and using EasyExcel with batch processing, pagination, and JDBC transactions to achieve high performance for hundreds of thousands to millions of rows.

EasyExcel · Excel · Java
0 likes · 23 min read
Aikesheng Open Source Community
Dec 21, 2022 · Databases

Importing Existing Histogram Data in MySQL 8.0.31

MySQL 8.0.31 introduces a new syntax that allows importing pre‑computed histogram data directly into a table, dramatically reducing the time needed to update histogram statistics compared with the traditional online creation method.

Database Statistics · MySQL · Performance
0 likes · 6 min read
Java Architect Essentials
Nov 14, 2022 · Big Data

Efficient Import and Export of Millions of Records Using Apache POI and EasyExcel

This article explains how to handle massive Excel import and export tasks in Java by comparing traditional POI implementations, selecting the appropriate Workbook type based on data volume, and leveraging Alibaba's EasyExcel library together with batch JDBC operations to process over three million rows with minimal memory usage and high performance.

Apache POI · Big Data · EasyExcel
0 likes · 22 min read
Model Perspective
Nov 13, 2022 · Fundamentals

Master Pandas: Install, Import Data, and Perform Powerful Data Analysis

This tutorial introduces the Pandas library, covering installation, data import from CSV and Excel, DataFrame creation, descriptive statistics, indexing with loc/iloc, and applying custom functions to clean and transform column values, all illustrated with code snippets and images.

Python · data analysis · data-import
0 likes · 6 min read
Architect
Oct 3, 2022 · Big Data

Efficient Import and Export of Massive Data Using POI and EasyExcel

This article explains how to handle large‑scale Excel import and export in Java by comparing traditional POI workbooks, selecting the appropriate implementation based on data volume and requirements, and presenting a high‑performance solution with EasyExcel, batch processing, and JDBC transactions for hundreds of millions of rows.

EasyExcel · Java · Performance
0 likes · 22 min read
Architect's Tech Stack
Sep 29, 2022 · Databases

Comprehensive Guide to Using JetBrains DataGrip for Database Management

This article provides a detailed tutorial on JetBrains DataGrip, covering installation, driver management, connection configuration, SQL editing shortcuts, data import/export, schema visualization, and various productivity features for efficiently working with relational databases.

DataGrip · Database Management · JetBrains
0 likes · 11 min read