
GPTuner: LLM-Driven PostgreSQL Knob Tuning

GPTuner, an LLM‑driven system for PostgreSQL knob tuning developed by researchers at Sichuan University, demonstrates that knowledge processing, knob selection, search‑range optimization, and a two‑stage Bayesian optimization framework each significantly improve performance, at a one‑time cost of roughly 880,000 GPT‑4 tokens (about $30), with the constructed knowledge reusable across runs.


Authors: Jiale Lao, Yibo Wang, Yufeng Li. Affiliation: Intelligent Systems Laboratory, Sichuan University.

Ablation Experiments: After evaluating GPTuner's end-to-end performance, the researchers further investigated how much each module contributes to it. Four ablation experiments verified the impact of knowledge processing, knob selection, search-range optimization, and the two-stage Bayesian optimization framework on GPTuner's performance.
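The two-stage framework mentioned above searches a small, coarse candidate space first and then refines around the winner. GPTuner itself uses Bayesian optimization; the sketch below substitutes plain random sampling to keep it self-contained, and the knob names, candidate values, and `benchmark` function are all hypothetical stand-ins, not values from the paper.

```python
import random

# Hypothetical knob space: for each knob, a coarse candidate set (e.g. values
# an LLM might extract from documentation) and the full numeric range.
KNOBS = {
    "shared_buffers_mb": {"coarse": [128, 1024, 4096], "range": (64, 8192)},
    "work_mem_mb": {"coarse": [4, 64, 256], "range": (1, 1024)},
}

def benchmark(config):
    """Stand-in for running a workload; lower is better (e.g. latency).
    A real tuner would execute the benchmark against PostgreSQL."""
    return (abs(config["shared_buffers_mb"] - 2048) / 2048
            + abs(config["work_mem_mb"] - 128) / 128)

def two_stage_search(knobs, coarse_trials=20, fine_trials=30, seed=0):
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    # Stage 1: coarse search over the small discrete candidate sets.
    for _ in range(coarse_trials):
        cfg = {k: rng.choice(v["coarse"]) for k, v in knobs.items()}
        score = benchmark(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    # Stage 2: fine-grained numeric search in a window around the coarse
    # winner, clamped to each knob's full range.
    for _ in range(fine_trials):
        cfg = {}
        for k, v in knobs.items():
            lo, hi = v["range"]
            proposal = int(best_cfg[k] * rng.uniform(0.5, 2.0))
            cfg[k] = max(lo, min(hi, proposal))
        score = benchmark(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

The design point the ablation tests is that stage 1 exploits LLM-suggested discrete values cheaply, so stage 2 starts its continuous search from an already-good region instead of the full range.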

Cost Analysis: Finally, because API calls add overhead, the researchers analyzed cost. Using GPT-4 as the language model, GPTuner consumed approximately 880,000 tokens (about $30) to process PostgreSQL's 60 knobs. Notably, the constructed knowledge can be reused repeatedly, avoiding the overhead of rebuilding it for each tuning run.
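The arithmetic behind those figures is simple to reproduce. The blended per-token rate below is backed out of the reported totals ($30 for 880,000 tokens), since the article does not give the input/output token split; the session count in the amortization example is likewise an illustrative assumption.

```python
def api_cost_usd(tokens, usd_per_1k_tokens):
    """Estimated API spend: tokens consumed times a blended per-1K-token price."""
    return tokens / 1000 * usd_per_1k_tokens

def amortized_cost_usd(one_time_usd, tuning_sessions):
    """Because the extracted knowledge is reusable, the one-time construction
    cost is spread across every later tuning session."""
    return one_time_usd / tuning_sessions

# $30 / 880K tokens implies a blended rate of roughly $0.034 per 1K tokens.
implied_rate = 30 / 880
total = api_cost_usd(880_000, implied_rate)        # ~ $30 one-time cost
per_session = amortized_cost_usd(total, 10)        # ~ $3 over 10 reuses
```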

For more details, please refer to the paper: https://arxiv.org/abs/2311.03157

Code repository: https://github.com/SolidLao/GPTuner

Tags: LLM, PostgreSQL, cost analysis, ablation study, Database Tuning, GPTuner
Written by

Sohu Tech Products

A knowledge-sharing platform for Sohu's technology products. As a leading Chinese internet brand with media, video, search, and gaming services and over 700 million users, Sohu continuously drives tech innovation and practice. We’ll share practical insights and tech news here.
