Tag: large-data

0 views collected under this tag.

php中文网 Courses
Mar 5, 2025 · Backend Development

Using PHP Generator Functions to Create Infinite Iterable Objects for Large Data Processing

The article explains how PHP generator functions can create infinite iterable objects, such as a Fibonacci sequence, to process large data sets efficiently by yielding values on demand, reducing memory usage and improving performance, with practical code examples and discussion of advantages and use cases.

Generator · Memory Optimization · large-data
0 likes · 5 min read
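The yield-based technique this article describes carries over directly to other languages; as a minimal sketch (in Python, which also uses a `yield` keyword, with a function name of my choosing rather than the article's), an infinite Fibonacci generator consumed lazily looks like this:

```python
from itertools import islice

def fibonacci():
    """Yield Fibonacci numbers forever, computing each value only on demand."""
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Take only the first 10 values from the infinite sequence.
first_ten = list(islice(fibonacci(), 10))
print(first_ten)  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

Because values are produced one at a time, memory use stays constant no matter how deep into the sequence the consumer reads.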
php中文网 Courses
Jun 12, 2024 · Backend Development

Using PHP Generator Functions to Create Infinite Iterable Objects for Large Data Processing

This article explains how PHP generator functions can create infinite iterable objects, demonstrating with a Fibonacci sequence example to efficiently handle large data sets while reducing memory usage and improving performance in backend development.

Generator · Infinite Sequence · Memory Optimization
0 likes · 5 min read
php中文网 Courses
Dec 1, 2023 · Backend Development

Using PHP Generators to Process Large Data Sets and Prevent Memory Exhaustion

This article explains how PHP developers can use generators to iterate over large data sets without exhausting memory, covering the concept, syntax with the yield keyword, step‑by‑step examples, converting regular functions, handling key‑value pairs, sending data back, returning values, and a real‑world file‑reading use case.

Generators · Memory Management · backend
0 likes · 5 min read
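The file-reading use case this article closes with can be sketched the same way; below is a minimal illustration (in Python rather than the article's PHP, with a throwaway temp file standing in for a large log):

```python
import os
import tempfile

def read_lines(path):
    """Yield one line at a time so only a single line is held in memory."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            yield line.rstrip("\n")

# Demo: a small temporary file stands in for a multi-gigabyte log.
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as tmp:
    tmp.write("alpha\nbeta\ngamma\n")

lines = list(read_lines(tmp.name))
os.unlink(tmp.name)
print(lines)  # ['alpha', 'beta', 'gamma']
```

In real use the consumer would process each line inside the loop instead of collecting them into a list, keeping peak memory independent of file size.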
JD Tech
Nov 9, 2023 · Databases

Optimizing Pagination Queries for Billion‑Row MySQL Tables

This article analyzes the performance problems of LIMIT‑based pagination on massive MySQL tables and presents three progressively more efficient solutions: a simple LIMIT approach, a tag‑record method that seeks from the last primary key, and a range‑limit method with a cached minimum ID. It closes with indexing best practices that keep query latency in the tens of milliseconds even for tables containing billions of rows.

Index Optimization · MySQL · SQL
0 likes · 12 min read
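The tag-record method both JD posts describe (remember the last primary key already served and seek past it, instead of making the server skip OFFSET rows) works on any database with an indexed key; below is a minimal sketch using Python's built-in sqlite3 and a hypothetical `fans` table, not the articles' MySQL schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fans (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO fans (id, name) VALUES (?, ?)",
    [(i, f"user{i}") for i in range(1, 101)],
)

def page_after(last_id, size):
    """Tag-record pagination: seek past the last seen primary key, no OFFSET."""
    return conn.execute(
        "SELECT id, name FROM fans WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, size),
    ).fetchall()

page1 = page_after(0, 10)             # ids 1..10
page2 = page_after(page1[-1][0], 10)  # ids 11..20, found via index seek
```

The cost of fetching page N stays flat, because the `WHERE id > ?` predicate lands directly on the primary-key index instead of scanning and discarding all earlier rows.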
JD Retail Technology
Sep 13, 2023 · Databases

Optimizing Pagination Queries for Billion‑Row MySQL Tables

The article analyzes the performance problems of deep pagination on massive MySQL tables storing billions of fan records and presents three progressive solutions—simple LIMIT, tag‑record (maxId) pagination, and range‑limited pagination with async and offline minId caching—along with general indexing best practices for high‑throughput queries.

Indexing · MySQL · Query Optimization
0 likes · 10 min read
Rare Earth Juejin Tech Community
Sep 6, 2023 · Frontend Development

Efficient Rendering of Large Datasets in Vue.js Frontend Applications

This article demonstrates several techniques—including server‑side data simulation, batch rendering with timers, requestAnimationFrame, pagination components, infinite scroll, and virtual lists—to efficiently display and interact with 100,000 records in a Vue.js front‑end without causing UI freezes or performance degradation.

Infinite Scroll · JavaScript · Performance
0 likes · 21 min read
Architect's Tech Stack
Aug 7, 2023 · Databases

High‑Speed Bulk Loading of 20 Billion Rows into MySQL Using TokuDB

This article details a real‑world test of loading over 20 billion records into MySQL with XeLabs TokuDB, covering the demand, configuration tweaks, table schema, bulk‑loader commands, performance metrics, comparison with InnoDB, and practical conclusions for large‑scale data ingestion.

Bulk Loading · Database Optimization · MySQL
0 likes · 7 min read
Laravel Tech Community
Jul 5, 2023 · Databases

Performance Testing and Optimization of Large‑Scale MySQL Queries

This article demonstrates how to generate, insert, and query millions of rows in a MySQL 5.7 table, measures pagination performance under varying offsets and result sizes, and presents several optimization techniques—including sub‑queries, indexed look‑ups, and column selection—to dramatically reduce query latency.

Database · MySQL · Performance Testing
0 likes · 10 min read
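The sub-query optimization several of these benchmarks land on defers the expensive row fetch: skip ahead on the narrow primary-key index first, then read full rows only for the page actually returned. A minimal sketch with Python's sqlite3 and a made-up table `t` (the articles test against MySQL 5.7):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO t (id, payload) VALUES (?, ?)",
    [(i, "x" * 100) for i in range(1, 1001)],
)

offset, size = 900, 10
# Sub-query pagination: the inner SELECT skips `offset` rows touching only
# the primary-key index; the outer query then reads the wide payload
# column for just the `size` rows being returned.
rows = conn.execute(
    "SELECT id, payload FROM t "
    "WHERE id >= (SELECT id FROM t ORDER BY id LIMIT 1 OFFSET ?) "
    "ORDER BY id LIMIT ?",
    (offset, size),
).fetchall()
```

On a table with wide rows, this avoids reading and discarding the payload for every skipped row, which is where plain `LIMIT offset, size` spends most of its time at deep offsets.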
Architect's Guide
Jun 28, 2023 · Databases

Testing MySQL Pagination Performance on Large Datasets

This article demonstrates how to create a 10‑million‑row MySQL table, batch‑insert data via a stored procedure, measure ordinary LIMIT pagination versus offset‑optimized queries, and provides practical tips for improving query speed on massive tables.

Index Optimization · MySQL · Performance Testing
0 likes · 9 min read
Top Architect
May 9, 2023 · Databases

Performance Testing and Optimization of MySQL Pagination for Large Datasets

This article demonstrates how to generate, insert, and query ten‑million‑row MySQL tables, measures the latency of ordinary LIMIT pagination, analyzes the impact of offset size and result set size, and presents practical optimization techniques such as sub‑query pagination and ID‑range filtering to dramatically improve query speed.

MySQL · Optimization · Performance
0 likes · 12 min read
Architect's Guide
Apr 1, 2023 · Big Data

Handling Large Data Queries in MySQL with MyBatis: Regular, Stream, and Cursor Approaches

The article explains how to efficiently retrieve and process massive MySQL result sets in Java using MyBatis, comparing regular pagination, streaming queries via Cursor, and cursor-based fetchSize techniques, and provides practical code examples and best‑practice tips to avoid OOM and improve performance.

Cursor · FetchSize · MyBatis
0 likes · 10 min read
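MyBatis' `Cursor` and `fetchSize` streaming is Java-specific, but the underlying idea (pull rows in bounded batches instead of materializing the whole result set, so memory stays flat and OOM is avoided) can be sketched with any DB-API cursor; here in Python's sqlite3 with an invented `big` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, v INTEGER)")
conn.executemany(
    "INSERT INTO big (id, v) VALUES (?, ?)",
    [(i, i * 2) for i in range(1, 10001)],
)

def stream(batch_size=1000):
    """Pull rows from the cursor in fixed-size batches rather than all at once."""
    cur = conn.execute("SELECT id, v FROM big ORDER BY id")
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break
        yield from batch

# Aggregate 10,000 rows while holding at most one batch in memory.
total = sum(v for _, v in stream())
```

With a real MySQL driver the same shape applies, except the batch size is negotiated with the server (the role `fetchSize` plays in the JDBC setup the articles describe).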
Top Architect
Mar 15, 2023 · Databases

Handling Large Data Sets in MySQL: Regular, Streaming, and Cursor Queries with MyBatis

The article explains how to process massive MySQL data sets—covering data migration, export, and batch handling—by comparing regular pagination, streaming queries using server‑side cursors, and cursor‑based fetchSize techniques, and provides concrete MyBatis code examples for each approach.

Cursor · Database Optimization · MyBatis
0 likes · 8 min read
Architecture Digest
May 9, 2021 · Databases

Optimizing MySQL Pagination with LIMIT: Methods, Experiments, and Index Strategies

This article examines the performance drawbacks of MySQL's LIMIT pagination on large tables, presents six practical query methods—including direct LIMIT, primary‑key indexing, index‑based ordering, prepared statements, covering indexes, and sub‑query/join techniques—provides extensive benchmark results, and offers concrete indexing recommendations to achieve fast, stable pagination even with millions of rows.

Indexing · MySQL · Performance
0 likes · 12 min read
Architect's Tech Stack
Jan 9, 2021 · Databases

High‑Speed Loading of 2 Billion Rows into MySQL Using TokuDB

This article describes a real‑world requirement to import over 2 billion records into MySQL, analyzes the challenges, introduces XeLabs TokuDB with its optimizations, details the test schema and configuration, demonstrates bulk loading commands, presents performance metrics showing up to 570 k rows per second, and concludes with practical recommendations and environment details.

Bulk Load · Database Optimization · MySQL
0 likes · 7 min read
Top Architect
Jan 2, 2021 · Backend Development

Optimizing Large-Scale Excel Import/Export with Apache POI to Avoid OOM and Reduce GC

This article explains how to prevent frequent full GC and out‑of‑memory errors when using Apache POI for massive Excel imports and exports by leveraging SXSSFWorkbook for XLSX, choosing appropriate processing models for XLS/XLSX, and provides performance test results and code samples.

Apache POI · Excel · Java
0 likes · 18 min read
Architecture Digest
Aug 26, 2020 · Databases

SQL Query Optimization for Large‑Scale MES Reporting Using Stored Procedures

This article details how to transform a painfully slow SSRS report that scanned billions of rows into a high‑performance solution by analyzing the original SQL, fixing indexing and partitioning mistakes, and rewriting the logic as a flexible stored procedure that runs in seconds.

Database Optimization · MES · SQL
0 likes · 21 min read
Architect's Tech Stack
Apr 8, 2019 · Databases

High‑Performance Bulk Loading of Over 2 Billion Rows into MySQL Using TokuDB

This article describes how to ingest more than two billion records into MySQL by leveraging XeLabs TokuDB’s bulk‑loader, detailing configuration, table schema, performance metrics, and a comparison with InnoDB to demonstrate a three‑to‑fourfold speed improvement.

Bulk Load · Database Optimization · MySQL
0 likes · 7 min read
Java Captain
Feb 14, 2019 · Databases

High‑Performance Bulk Loading of Over 2 Billion Rows into MySQL Using XeLabs TokuDB

This article describes how to quickly import more than two billion rows from a big‑data source into MySQL by leveraging XeLabs TokuDB’s bulk‑loader, showing configuration, code examples, performance results, and practical recommendations for handling large‑scale data ingestion.

Bulk Load · Database Optimization · MySQL
0 likes · 6 min read
37 Interactive Technology Team
Jun 20, 2018 · Backend Development

Handling Large Arrays in PHP: File and Database Processing Strategies

To prevent memory overflow when processing massive arrays in PHP, read large files line‑by‑line, batch database queries, promptly unset variables, and prefer a pre‑counted for‑loop over foreach for mutable collections, thereby freeing memory gradually and ensuring stable script execution.

Database · Memory Management · PHP
0 likes · 9 min read