Asynchronous Cache Refresh Mechanism for Interface Optimization Using Redis and RabbitMQ

This article describes an asynchronous cache refresh mechanism built on Redis and RabbitMQ that significantly improved interface performance and stability for a high‑traffic used‑car platform, meeting strict tp99 latency requirements while keeping data fresh.


1. Background and Results

With the rapid development of the Internet and mobile devices, users expect pages and apps to load quickly. When third‑party APIs are unstable, performance suffers. In a used‑car business that depends on many external API calls, we devised an asynchronous cache refresh solution using Redis and RabbitMQ that greatly improved system performance and stability and met our tp99 latency targets.

2. Solution Iteration

We first accessed APIs directly without caching. As traffic grew, we added a local cache, which introduced inconsistency across servers. Next we moved to a distributed Redis cache, solving consistency but raising data‑freshness concerns for short‑lived data. Finally, we combined Redis with RabbitMQ to create an asynchronous cache refresh mechanism that balances performance and real‑time data needs.

[Figure: early execution flow diagram]

3. Asynchronous Cache Refresh Technical Solution

The architecture consists of user clients, API services, Redis, RabbitMQ, cacheable APIs, data sources, and logging. We built a dedicated microservice for cache handling, provisioned Redis and RabbitMQ capacity, defined a unified cache key rule, and created a separate consumer service to update Redis asynchronously.
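A unified cache key rule means every service derives the same key for the same logical request. As a minimal sketch (the actual key convention, field names, and hashing choice are assumptions, not the platform's real rule), a key can be built from the service name, endpoint, and a digest of the sorted request parameters:

```python
# Sketch of a unified cache key rule. The "cache:<service>:<endpoint>:<digest>"
# layout and the use of an MD5 digest are illustrative assumptions.
import hashlib
import json

def cache_key(service: str, endpoint: str, params: dict) -> str:
    """Build a deterministic Redis key for one logical API request."""
    # Sort the parameters so the same logical request always yields the same key.
    digest = hashlib.md5(
        json.dumps(params, sort_keys=True).encode("utf-8")
    ).hexdigest()
    return f"cache:{service}:{endpoint}:{digest}"

key = cache_key("vehicle", "detail", {"car_id": 123, "fields": "price"})
```

Hashing the parameters keeps keys short and uniform even when requests carry many or large parameters, at the cost of keys no longer being human-readable.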

Execution flow:

1. When a client requests an API, the service first checks Redis. If a valid cache entry exists, it returns the data immediately and, if the entry's age exceeds a freshness threshold, pushes a refresh message to RabbitMQ.

2. On a cache miss, the service calls the downstream API synchronously and stores the result in Redis together with a timestamp.

3. A RabbitMQ consumer continuously processes refresh messages, re‑fetches the source API, and overwrites the Redis entry, keeping data fresh without blocking the client request.
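The service-side read path above can be sketched as follows. To keep the example self-contained, an in-memory dict stands in for Redis and a `queue.Queue` stands in for the RabbitMQ queue; the threshold value, field names, and downstream call are all illustrative assumptions:

```python
# Sketch of the read path: cache hit returns immediately (enqueueing a refresh
# if the entry is stale); cache miss fetches synchronously and caches the result.
import json
import queue
import time

REFRESH_AFTER = 60          # seconds before a hit also triggers a refresh (assumed)
cache: dict[str, str] = {}  # stands in for Redis
refresh_mq: queue.Queue = queue.Queue()  # stands in for a RabbitMQ queue

def call_downstream_api(key: str) -> dict:
    # Placeholder for the real third-party API call.
    return {"key": key, "value": "fresh-data"}

def get_with_async_refresh(key: str) -> dict:
    raw = cache.get(key)
    if raw is not None:
        entry = json.loads(raw)
        # Cache hit: return at once; if the entry is older than the
        # threshold, enqueue a refresh message instead of blocking the caller.
        if time.time() - entry["ts"] > REFRESH_AFTER:
            refresh_mq.put(key)
        return entry["data"]
    # Cache miss: call the downstream API synchronously and store the
    # result in Redis together with a timestamp.
    data = call_downstream_api(key)
    cache[key] = json.dumps({"ts": time.time(), "data": data})
    return data
```

The key property is that a stale hit still returns the cached value immediately; only the background consumer pays the downstream latency.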

[Figure: execution flow diagram]
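The consumer side can be sketched the same way. In production this would be a separate process consuming from RabbitMQ (e.g. via a client library) and writing to Redis; here a `queue.Queue` and a dict stand in so the sketch runs standalone, and the function names are assumptions:

```python
# Sketch of the refresh consumer: drain refresh messages, re-fetch the
# source API, and overwrite the cached entry with a fresh timestamp.
import json
import queue
import time

def refresh_worker(mq: queue.Queue, cache: dict, fetch) -> None:
    """Process refresh messages until the queue is drained."""
    while True:
        try:
            key = mq.get_nowait()
        except queue.Empty:
            break
        # Re-fetch the source API and overwrite the cache entry so the
        # next client read sees fresh data without having waited for it.
        cache[key] = json.dumps({"ts": time.time(), "data": fetch(key)})
        mq.task_done()
```

Because the consumer is the only writer triggered by staleness, a slow or failing downstream API degrades freshness rather than client latency.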

4. Data Support

Performance tests on high‑traffic interfaces show that the asynchronous refresh approach with a 10‑minute Redis TTL matches the latency of a 4‑hour static cache while keeping data near real‑time, clearly outperforming a plain 10‑minute static cache.

[Chart: latency comparison across three configurations — Redis cache TTL = 4h; Redis cache TTL = 10min; RabbitMQ + Redis cache TTL = 10min]

5. Summary

The asynchronous cache refresh mechanism dramatically improves interface performance, reduces reliance on unstable third‑party APIs, and satisfies strict tp99 latency requirements while maintaining data freshness, especially for APIs that can only tolerate short cache lifetimes.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: backend, Performance, Cache, Asynchronous, RabbitMQ
Written by HomeTech (HomeTech tech sharing)