How Treating Your Database as a Compute Engine Boosted Dashboard Speed 166×
Teams often mistake the database for mere storage. This article walks through a real-world Node.js dashboard that suffered severe latency, shows how moving aggregation and filtering into SQL eliminated N+1 queries and in-app filtering, and closes with the measured performance gains and best-practice guidelines.
Background and Problem
For three years the author asked engineers to describe a time they faced a database performance issue. Most answers were "add an index", "switch databases", "add a cache", or "implement read replicas"—all temporary fixes that ignore the root cause.
The core problem is that many teams treat the database as a dumb storage bucket instead of a powerful processing engine.
Naïve Dashboard Implementation
A revenue dashboard fetched all orders and products in Node.js, built lookup maps, and performed aggregation in JavaScript:
async function getRevenueDashboard(filters) {
  // Fetch all orders from the last year
  const orders = await db.query(`
    SELECT id, amount, created_at, product_id, region, customer_id
    FROM orders
    WHERE created_at > NOW() - INTERVAL '1 year'
  `);

  // Fetch all products
  const products = await db.query(`
    SELECT id, name, category
    FROM products
  `);

  const productMap = products.reduce((acc, p) => { acc[p.id] = p; return acc; }, {});
  const result = { totalRevenue: 0, byCategory: {}, byRegion: {}, byMonth: {} };

  orders.forEach(order => {
    const product = productMap[order.product_id];
    const category = product?.category || 'Unknown';
    const month = order.created_at.toISOString().substring(0, 7);
    result.totalRevenue += order.amount;
    result.byCategory[category] = (result.byCategory[category] || 0) + order.amount;
    result.byRegion[order.region] = (result.byRegion[order.region] || 0) + order.amount;
    result.byMonth[month] = (result.byMonth[month] || 0) + order.amount;
  });

  return result;
}

Performance degraded as data grew: 5,000 orders took 200 ms, 350,000 orders took 7.8 s, and the CEO complained.
Typical Fixes That Didn’t Help
Added indexes on created_at and product_id (minor gain).
Implemented Redis cache (helpful but added complexity).
Optimized JavaScript loops (saved ~200 ms).
None addressed the fundamental issue: pulling millions of rows into the application and doing work that the database can perform a thousand times faster.
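A quick back-of-envelope calculation (the per-row size is an assumed, illustrative figure; the article gives no payload numbers) shows why indexes and caches could not fix the design: the app was transferring and deserializing tens of megabytes per dashboard load.

```javascript
// Back-of-envelope: payload shipped to the app per dashboard load.
// ~120 bytes per serialized order row is an assumption for illustration.
const rows = 350_000;
const bytesPerRow = 120;
const megabytes = (rows * bytesPerRow) / (1024 * 1024);
console.log(megabytes.toFixed(1)); // ~40 MB, before any JSON parsing cost
```

No index makes that transfer smaller; only sending less data does.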
Database‑Centric Solution
A database consultant rewrote the query to let PostgreSQL handle aggregation, grouping, and filtering:
WITH monthly_revenue AS (
  SELECT DATE_TRUNC('month', o.created_at) AS month,
         p.category,
         o.region,
         SUM(o.amount) AS revenue
  FROM orders o
  JOIN products p ON o.product_id = p.id
  WHERE o.created_at > NOW() - INTERVAL '1 year'
  GROUP BY DATE_TRUNC('month', o.created_at), p.category, o.region
)
SELECT
  (SELECT SUM(revenue) FROM monthly_revenue) AS total_revenue,
  (SELECT json_object_agg(category, category_revenue) FROM (
    SELECT category, SUM(revenue) AS category_revenue FROM monthly_revenue GROUP BY category) cat) AS by_category,
  (SELECT json_object_agg(region, region_revenue) FROM (
    SELECT region, SUM(revenue) AS region_revenue FROM monthly_revenue GROUP BY region) reg) AS by_region,
  (SELECT json_object_agg(month, month_revenue) FROM (
    SELECT month, SUM(revenue) AS month_revenue FROM monthly_revenue GROUP BY month ORDER BY month) mon) AS by_month;

With the same 350,000 orders the response time dropped to 47 ms, a 166× improvement.
Real‑World Code After Refactor
async function getRevenueDashboard(filters) {
  const result = await db.query(`
    -- That whole SQL query above
  `);
  return result.rows[0];
}

Now Node.js merely forwards the result.
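The article never shows how the `filters` argument reaches the query. One common pattern with node-postgres-style drivers is to build a parameterized fragment and append it to the existing `WHERE` predicate; the sketch below uses hypothetical filter names (`region`, `category`) not taken from the article.

```javascript
// Sketch: turn an optional filters object into a parameterized WHERE
// fragment ($1, $2, ...) so user input never lands in the SQL string.
// Filter names are hypothetical; extend per dashboard needs.
function buildFilterClause(filters = {}) {
  const conditions = [];
  const params = [];
  if (filters.region) {
    params.push(filters.region);
    conditions.push(`o.region = $${params.length}`);
  }
  if (filters.category) {
    params.push(filters.category);
    conditions.push(`p.category = $${params.length}`);
  }
  // Empty string when no filters, so the base query is unchanged
  const clause = conditions.length ? ` AND ${conditions.join(' AND ')}` : '';
  return { clause, params };
}
```

The fragment is concatenated after the `created_at` predicate and the params array passed as the second argument to `db.query(sql, params)`.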
Common Mistakes Illustrated
1. N+1 Queries
// Bad: N+1 queries
const users = await db.query('SELECT * FROM users');
for (const user of users) {
  const orders = await db.query('SELECT * FROM orders WHERE user_id = ?', [user.id]);
  user.orderCount = orders.length;
  user.totalSpent = orders.reduce((sum, o) => sum + o.amount, 0);
}

With 1,000 users this runs 1,001 queries.
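That round-trip count is easy to verify with an in-memory stand-in for `db.query` (a synchronous sketch purely to make the counting visible; a real driver is asynchronous):

```javascript
// Illustrative stand-in: count round trips in the N+1 pattern.
let queryCount = 0;
const users = Array.from({ length: 1000 }, (_, i) => ({ id: i + 1 }));
const fakeDb = {
  query() {
    queryCount += 1;
    return []; // stands in for each user's orders; empty is fine for counting
  },
};

fakeDb.query('SELECT * FROM users'); // 1 query for the user list
for (const user of users) {
  fakeDb.query('SELECT * FROM orders WHERE user_id = ?', [user.id]); // +1 per user
}
console.log(queryCount); // 1001
```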
// Good: one query
const users = await db.query(`
  SELECT u.*,
         COUNT(o.id) AS order_count,
         COALESCE(SUM(o.amount), 0) AS total_spent
  FROM users u
  LEFT JOIN orders o ON u.id = o.user_id
  GROUP BY u.id
`);

2. Filtering in Application Code
// Bad: fetch everything, filter in code
const products = await db.query('SELECT * FROM products');
const available = products.filter(p =>
  p.stock > 0 && p.price < 100 && p.category === 'electronics' && !p.discontinued);

Fetching 50,000 rows just to keep 12.
-- Good: let the database filter
SELECT * FROM products
WHERE stock > 0
  AND price < 100
  AND category = 'electronics'
  AND discontinued = false;

3. Computing Derived Values in Code
// Bad: compute in application
const orders = await db.query(`SELECT amount, tax_rate FROM orders`);
orders.forEach(o => {
  o.taxAmount = o.amount * o.tax_rate;
  o.total = o.amount + o.taxAmount;
});

-- Good: compute in database
SELECT amount,
       tax_rate,
       amount * tax_rate AS tax_amount,
       amount + (amount * tax_rate) AS total
FROM orders;

When Database‑Side Processing Matters
Reporting and analytics that require aggregation, grouping, or statistical calculations.
Paginated sorted results—let the DB handle ORDER BY and LIMIT/OFFSET.
Complex business logic that can be expressed in SQL (e.g., customer lifetime value segmentation).
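For the pagination case, the `ORDER BY` column usually comes from user input and cannot be bound as a query parameter, so a common pattern is to allow-list sortable columns and compute `LIMIT`/`OFFSET` in one place. A sketch with illustrative column names:

```javascript
// Sketch: push sorting and paging into SQL instead of slicing arrays in JS.
// Allow-listing the column prevents ORDER BY injection; names are illustrative.
const SORTABLE = new Set(['created_at', 'amount', 'region']);

function buildPageClause(sortBy, page, pageSize = 50) {
  const column = SORTABLE.has(sortBy) ? sortBy : 'created_at'; // safe fallback
  const limit = Math.min(Math.max(pageSize, 1), 200);          // clamp page size
  const offset = Math.max(page - 1, 0) * limit;
  return `ORDER BY ${column} DESC LIMIT ${limit} OFFSET ${offset}`;
}
```

The fragment is appended to the base query; for deep pages, keyset pagination (`WHERE created_at < $lastSeen`) scales better than large offsets.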
ORMs: When to Use Them
ORMs excel at simple CRUD but force object‑centric thinking for complex queries, often pulling far more data than needed.
// Typical ORM approach (fetches too much)
const users = await User.findAll({ include: [{ model: Order, required: false }] });
// Still need to sum orders in JavaScript

Guideline: use the ORM for straightforward reads and writes; drop to raw SQL for aggregations, joins, subqueries, window functions, or any operation that returns summarized data.
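One wrinkle when dropping to raw SQL: drivers such as node-postgres commonly return `COUNT` and `SUM` results as strings (the values may exceed JavaScript's safe integer range), so the only JavaScript left is often a small shaping function. A sketch, with hypothetical column names:

```javascript
// Sketch: the raw-SQL result arrives as flat rows; one pure function
// attaches the aggregates where the ORM version would have looped in JS.
// Column names (order_count, total_spent) are hypothetical.
function shapeUserStats(rows) {
  return rows.map(r => ({
    id: r.id,
    name: r.name,
    orderCount: Number(r.order_count), // drivers often return aggregates as strings
    totalSpent: Number(r.total_spent),
  }));
}
```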
Under‑Used Database Features
Window Functions
SELECT date,
       revenue,
       SUM(revenue) OVER (ORDER BY date) AS running_total,
       RANK() OVER (ORDER BY revenue DESC) AS revenue_rank
FROM daily_revenue;

Common Table Expressions (CTEs)
WITH high_value_customers AS (
SELECT customer_id, SUM(amount) AS total
FROM orders
GROUP BY customer_id
HAVING SUM(amount) > 10000
), recent_orders AS (
SELECT * FROM orders WHERE created_at > NOW() - INTERVAL '30 days'
)
SELECT c.name,
hvc.total AS lifetime_value,
COUNT(ro.id) AS recent_orders
FROM customers c
JOIN high_value_customers hvc ON c.id = hvc.customer_id
LEFT JOIN recent_orders ro ON c.id = ro.customer_id
GROUP BY c.id, c.name, hvc.total;

JSON Operations (PostgreSQL)
SELECT id,
metadata->>'name' AS name,
metadata->'preferences'->>'theme' AS theme
FROM users
WHERE metadata @> '{"status": "active"}';

Generated Columns
CREATE TABLE products (
id SERIAL PRIMARY KEY,
price DECIMAL(10,2),
tax_rate DECIMAL(3,2),
price_with_tax DECIMAL(10,2) GENERATED ALWAYS AS (price * (1 + tax_rate)) STORED
);

Constraints for Business Rules
CREATE TABLE orders (
id SERIAL PRIMARY KEY,
quantity INTEGER CHECK (quantity > 0),
status VARCHAR(20) CHECK (status IN ('pending','processing','completed','cancelled')),
completed_at TIMESTAMP,
CONSTRAINT completed_orders_must_have_date
CHECK (status != 'completed' OR completed_at IS NOT NULL)
);

Mindset Shift
Stop thinking "the database only stores data" and start seeing it as a compute engine that should perform aggregation, filtering, and transformation. Let the application handle HTTP, authentication, input validation, and orchestrating calls, while the database stores, processes, aggregates, filters, joins, enforces constraints, and returns exactly the needed result set.
Results After Refactoring
Dashboard load time: 7.8 s → 47 ms (166× faster).
API p95 latency: 3.2 s → 180 ms.
Database CPU usage: down 40 %.
Application memory usage: down 60 %.
Redis cache cost: down 70 %.
Developer happiness: significantly increased.
The team still uses Node.js, React, and an ORM for simple operations, but now treats the database as a powerful engine, delivering a dramatically better experience.