Implementing Two-Level Caching with Spring Cache, Caffeine, and Redis
This article explains how to replace manual cache handling with Spring Cache annotations, introduces the concepts of L1 (Caffeine) and L2 (Redis) caches, discusses design considerations around consistency, null handling, and eviction, and provides practical Maven and configuration examples for Java backend applications.
Hello everyone, I'm Chen~
Before learning Spring Cache, many developers hard‑code cache logic. The article starts with a typical manual cache example for user data retrieval, showing how cache keys are defined, Redis is queried first, and database fallback occurs, followed by storing results back into Redis.
@Autowired
private UserMapper userMapper;

@Autowired
private RedisCache redisCache;

// Query a user by ID
public User getUserById(Long userId) {
    // Build the cache key
    String cacheKey = "userId_" + userId;
    // Check Redis first
    User user = redisCache.get(cacheKey);
    if (user != null) {
        return user;
    }
    // Fall back to the database
    user = userMapper.getUserById(userId);
    // Cache the result if it is not null
    if (user != null) {
        redisCache.set(cacheKey, user);
    }
    return user;
}

The manual approach leads to repetitive code for the four typical cache actions (store, read, update, delete) and tightly couples cache operations with business logic, making it hard to disable caching during development or to switch cache providers.
Using Spring Cache annotations simplifies this. By annotating the mapper interface:
@Mapper
public interface UserMapper {

    /**
     * Get a user by ID, using the cache if available.
     */
    @Cacheable(value = "userCache", key = "'cache_user_id_' + #userId")
    User getUserById(Long userId);
}

And the service implementation becomes a thin delegate:
@Autowired
private UserMapper userMapper;

public User getUserById(Long userId) {
    return userMapper.getUserById(userId);
}

Spring Cache provides two core interfaces, Cache and CacheManager, which abstract cache operations and cache creation respectively.
Cache Interface
Defines methods such as get, put, evict, and clear, along with a ValueWrapper to handle null values.
package org.springframework.cache;

import java.util.concurrent.Callable;

public interface Cache {

    String getName();

    Object getNativeCache();

    ValueWrapper get(Object key);

    <T> T get(Object key, Class<T> type);

    <T> T get(Object key, Callable<T> valueLoader);

    void put(Object key, Object value);

    ValueWrapper putIfAbsent(Object key, Object value);

    void evict(Object key);

    void clear();

    interface ValueWrapper {
        Object get();
    }
}

CacheManager Interface
Creates and retrieves named caches.
public interface CacheManager {

    Cache getCache(String name);

    Collection<String> getCacheNames();
}

Common annotations include @Cacheable (read), @CachePut (write), @CacheEvict (delete), @Caching (grouping several operations), and @EnableCaching (the global switch).
Two‑Level Cache Considerations
Using Redis as a remote L2 cache reduces DB load, but network latency can still be significant. Introducing an in‑process L1 cache (Caffeine) can further improve performance for hot keys.
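The read path of such a composite cache can be sketched with two in-process maps standing in for Caffeine (L1) and Redis (L2); the class below illustrates the flow only and is not the starter's actual implementation:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Illustrative two-level read-through: check L1, then L2, then the source
// of truth, back-filling both levels on the way out.
public class TwoLevelCacheSketch<K, V> {
    private final Map<K, V> l1 = new ConcurrentHashMap<>(); // stands in for Caffeine
    private final Map<K, V> l2 = new ConcurrentHashMap<>(); // stands in for Redis

    public V get(K key, Function<K, V> dbLoader) {
        V value = l1.get(key);       // 1. in-process hit: no network at all
        if (value != null) {
            return value;
        }
        value = l2.get(key);         // 2. remote hit: one network round trip
        if (value != null) {
            l1.put(key, value);      // promote to L1 for subsequent reads
            return value;
        }
        value = dbLoader.apply(key); // 3. miss: query the database
        if (value != null) {
            l2.put(key, value);
            l1.put(key, value);
        }
        return value;
    }

    public void evict(K key) {
        // Evict L2 first so other nodes cannot re-populate L1 from stale data.
        l2.remove(key);
        l1.remove(key);
    }
}
```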
Key challenges for L1 caches in a distributed environment include consistency across nodes, handling null values to avoid cache penetration, cache warm‑up, size limits, eviction policies, and expiration handling.
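Of these, null handling is easy to illustrate: caching a sentinel for "not found" keys stops repeated misses from reaching the database, which is the purpose of an allowNullValues-style option. A minimal sketch, where the sentinel object and the omitted sentinel TTL are simplifying assumptions:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Caching a sentinel for "not found" so repeated misses skip the database.
public class NullCachingSketch {
    private static final Object NULL_SENTINEL = new Object();
    private final Map<String, Object> cache = new ConcurrentHashMap<>();

    @SuppressWarnings("unchecked")
    public <V> V get(String key, Function<String, V> dbLoader) {
        Object cached = cache.get(key);
        if (cached == NULL_SENTINEL) {
            return null;             // known miss: do not touch the database
        }
        if (cached != null) {
            return (V) cached;
        }
        V loaded = dbLoader.apply(key);
        // In production the sentinel should carry a short TTL so a key that
        // appears later is eventually picked up; omitted in this sketch.
        cache.put(key, loaded != null ? loaded : NULL_SENTINEL);
        return loaded;
    }
}
```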
Caffeine offers high‑performance in-process caching with configurable loading strategies (manual, synchronous, asynchronous) and eviction policies based on size, time, or reference strength (weak/soft). It uses the Window TinyLFU (W‑TinyLFU) admission policy to achieve near‑optimal hit rates.
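Stand-alone Caffeine usage follows this shape (assuming the com.github.ben-manes.caffeine:caffeine dependency is on the classpath; the capacity, size, and TTL values mirror the configuration shown later):

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.concurrent.TimeUnit;

public class CaffeineExample {
    public static void main(String[] args) {
        // Size- and time-bounded cache; entries are admitted and evicted
        // according to the W-TinyLFU policy.
        Cache<String, String> cache = Caffeine.newBuilder()
                .initialCapacity(1000)
                .maximumSize(3000)
                .expireAfterWrite(180, TimeUnit.SECONDS)
                .build();

        // Manual loading: compute on miss, then reuse the cached value.
        String value = cache.get("user01", key -> "loaded:" + key);
        System.out.println(value);

        cache.invalidate("user01"); // explicit eviction
    }
}
```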
Sample Maven Dependency
<dependency>
    <groupId>com.jincou</groupId>
    <artifactId>redis-caffeine-cache-starter</artifactId>
    <version>1.0.0</version>
</dependency>

application.yml Configuration
# Two-level cache configuration
l2cache:
  config:
    allowNullValues: true
    composite:
      l1AllOpen: false
      l1Manual: true
      l1ManualKeySet:
        - userCache:user01
        - userCache:user02
        - userCache:user03
      l1ManualCacheNameSet:
        - userCache
        - goodsCache
    caffeine:
      autoRefreshExpireCache: false
      refreshPoolSize: 2
      refreshPeriod: 10
      expireAfterWrite: 180
      expireAfterAccess: 180
      initialCapacity: 1000
      maximumSize: 3000
    redis:
      defaultExpiration: 300000
      expires:
        userCache: 300000
        goodsCache: 50000
      topic: cache:redis:caffeine:topic

Enable Caching in Spring Boot
@EnableCaching
@SpringBootApplication
public class CacheApplication {
    public static void main(String[] args) {
        SpringApplication.run(CacheApplication.class, args);
    }
}

Cacheable Service Example
@Service
public class CaffeineCacheService {

    private final Logger logger = LoggerFactory.getLogger(CaffeineCacheService.class);

    private static Map<String, UserDTO> userMap = new HashMap<>();

    static {
        userMap.put("user01", new UserDTO("1", "Zhang San"));
        userMap.put("user02", new UserDTO("2", "Li Si"));
        userMap.put("user03", new UserDTO("3", "Wang Wu"));
        userMap.put("user04", new UserDTO("4", "Zhao Liu"));
    }

    @Cacheable(value = "userCache", key = "'cache_user_id_' + #userId")
    public UserDTO queryUser(String userId) {
        UserDTO userDTO = userMap.get(userId);
        try {
            // Simulate a slow lookup
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        logger.info("Loading data: {}", userDTO);
        return userDTO;
    }

    // sync = true serializes concurrent loads of the same key, so only one
    // caller computes the value while the others wait.
    @Cacheable(value = "userCache", key = "#userId", sync = true)
    public List<UserDTO> queryUserSyncList(String userId) {
        UserDTO userDTO = userMap.get(userId);
        List<UserDTO> list = new ArrayList<>();
        list.add(userDTO);
        logger.info("Loading data: {}", list);
        return list;
    }

    @CachePut(value = "userCache", key = "#userId")
    public UserDTO putUser(String userId, UserDTO userDTO) {
        return userDTO;
    }

    @CacheEvict(value = "userCache", key = "#userId")
    public String evictUserSync(String userId) {
        return userId;
    }
}

The source code is available on GitHub. Other popular two-level cache projects include Alibaba JetCache, J2Cache, and l2cache.
Overall, the article demonstrates how Spring Cache abstracts cache operations, how Caffeine provides an efficient in‑process L1 cache, and how Redis serves as a durable L2 cache, enabling developers to build scalable, low‑latency Java backend services.
Code Ape Tech Column
Former Ant Group P8 engineer and dedicated technologist, sharing full-stack Java, interview preparation, and career advice through this column. Site: java-family.cn