Optimizing Animated Image Loading on iOS: Practices and Performance Comparison
This article explains how decoding every frame of a GIF or animated WebP up front can exhaust memory and crash an iOS app (which has no native support for these formats), and describes an optimized per‑frame loading component (QMAnimatedImageView) that uses CADisplayLink, NSCache, downsampling, and memory‑limit tuning to sharply reduce CPU usage, memory usage, and jank while keeping animation smooth.
GIF and Animated WebP are the most common animated image formats on the Internet, but iOS's native UIImage does not directly support them. Various third‑party libraries such as SDWebImage and YYImage are used to fill this gap. This article, based on the QQ Music iOS client, introduces different solutions, evaluates their pros and cons, and presents an optimized implementation.
1. Problems and Optimization Results
Frequent crashes occurred when loading animated images because the app decoded all frames on a background thread, quickly exhausting memory and triggering NSMallocException or OOM. After a two‑month gray release (staged rollout) of a per‑frame decoding scheme wrapped in a generic component, QMAnimatedImageView, the following improvements were observed:
Resolved crashes caused by OOM, NSMallocException, and high CPU load.
First‑frame loading time remained unchanged despite per‑frame decoding.
Image memory hit rate increased from 65% to 76%.
Compared with YYAnimatedImageView and SDAnimatedImageView, CPU and memory usage were lower and smoothness was better.
The component supports GIF, Animated WebP, and APNG, and can reuse static images.
2. iOS Animated Image Display Methods
2.1 Using ImageIO.framework
Loading a GIF with [UIImage imageNamed:] or [UIImage imageWithData:] only returns a static image. To animate, ImageIO.framework parses each frame from the data and assigns the frames to UIImageView.animationImages. Example code:
// 1. Create a CGImageSourceRef from the GIF data (production code should check for NULL)
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
// 2. Decode every frame up front
size_t count = CGImageSourceGetCount(source);
NSMutableArray<UIImage *> *images = [NSMutableArray array];
for (size_t i = 0; i < count; i++) {
    CGImageRef image = CGImageSourceCreateImageAtIndex(source, i, NULL);
    if (image == NULL) {
        continue;
    }
    [images addObject:[UIImage imageWithCGImage:image
                                          scale:[UIScreen mainScreen].scale
                                    orientation:UIImageOrientationUp]];
    CGImageRelease(image);
}
CFRelease(source);
// 3. Hand all frames to UIImageView's built-in animation
UIImageView *imageView = [[UIImageView alloc] init];
imageView.animationImages = images;
imageView.animationDuration = 0.1 * count; // assumes a fixed 0.1 s per frame
[imageView startAnimating];
2.2 FLAnimatedImage
Flipboard’s early library follows a producer‑consumer model: FLAnimatedImageView (consumer) displays frames driven by CADisplayLink, while FLAnimatedImage (producer) decodes frames asynchronously using CGImageSourceCreateImageAtIndex. The library is now considered outdated compared with YYAnimatedImageView.
2.3 YYAnimatedImageView Implementation and Limitations
YYAnimatedImageView uses YYImageCoder for multi‑format decoding and employs semaphores for frame reading and asynchronous decoding. It caches frames in an NSDictionary and uses CADisplayLink for animation. However, NSDictionary can be compressed by the system under memory pressure, causing extra CPU overhead, and the view directly holds the frame cache, leading to repeated decoding during fast scrolling.
2.4 SDWebImage Versions Overview
Early versions of SDWebImage integrated FLAnimatedImageView for GIFs; SDWebImage 4 instead uses [UIImage animatedImageWithImages:duration:] to create a private _UIAnimatedImage, and calculates the greatest common divisor of the frame durations to avoid timing distortion. SDWebImage 5 introduces SDAnimatedImageView and the SDWebImageMatchAnimatedImageClass option for loading animated images:
SDAnimatedImageView *imageView = [SDAnimatedImageView new];
[imageView sd_setImageWithURL:[NSURL URLWithString:url]
             placeholderImage:nil
                      options:SDWebImageMatchAnimatedImageClass];

Note that the loaded image is cached as an SDAnimatedImage object; using a regular UIImageView with the same URL will display only the first frame.
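The greatest‑common‑divisor trick mentioned above can be sketched as follows: frame durations are reduced to a common base interval, and each frame is repeated proportionally, so a constant‑interval animation preserves the original per‑frame timing. This is an illustrative Swift sketch, not SDWebImage's actual code; the function names are assumptions.

```swift
import UIKit

// Illustrative sketch of the GCD-of-durations approach.
func gcd(_ a: Int, _ b: Int) -> Int { b == 0 ? a : gcd(b, a % b) }

/// Expands frames so they can play at one uniform interval without distortion.
/// Durations are given in centiseconds (the GIF timing unit); the array must be non-empty.
func uniformFrames(frames: [UIImage], centiseconds: [Int]) -> (images: [UIImage], duration: TimeInterval) {
    let base = centiseconds.reduce(centiseconds[0], gcd) // common tick, e.g. 2 cs
    var images: [UIImage] = []
    for (frame, cs) in zip(frames, centiseconds) {
        // A 6 cs frame with a 2 cs base tick is appended 3 times.
        for _ in 0..<(cs / base) { images.append(frame) }
    }
    // Total duration is unchanged: the sum of the original frame durations.
    let duration = TimeInterval(centiseconds.reduce(0, +)) / 100.0
    return (images, duration)
}
```

For example, durations of [2, 4, 6] centiseconds reduce to a 2 cs tick, so the three frames expand to 1 + 2 + 3 = 6 uniform frames totalling 0.12 s.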
3. QQ Music iOS Animated Image Loading Strategy and Issues
The existing pipeline, derived from SDWebImage, suffers from three main problems:
First‑frame loading is slow because all frames are decoded before display.
Uniform frame duration causes animation distortion.
Background decoding of all frames can cause memory spikes and crashes on low‑end devices.
The solution is a per‑frame loading approach: decode only the first frame initially, use CADisplayLink to trigger decoding of the next needed frame, and cache decoded frames asynchronously.
4. Optimization Practices
4.1 Per‑frame Decoding
Implementation steps:
Wrap the image in a QMAnimatedWebImage subclass that decodes only the first frame.
Display it with QMAnimatedImageView, a UIImageView subclass.
Use CADisplayLink inside the view to drive frame presentation.
Run an asynchronous task queue to decode upcoming frames and store them in the view’s cache.
Results show a ~50% reduction in first‑frame latency and no impact on overall performance.
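The steps above can be sketched roughly as follows. This is a minimal illustration of CADisplayLink‑driven per‑frame playback, not the actual QMAnimatedImageView implementation; the class and method names are assumptions, and a production version would also honor per‑frame durations, cache decoded frames, and skip ticks while a decode is still in flight.

```swift
import UIKit

/// Sketch: only the first frame is decoded up front; later frames are
/// decoded on a serial queue just before they are displayed.
final class FrameByFrameImageView: UIImageView {
    private var source: CGImageSource?
    private var frameIndex = 0
    private var displayLink: CADisplayLink?
    private let decodeQueue = DispatchQueue(label: "frame.decode")

    func play(data: Data) {
        guard let src = CGImageSourceCreateWithData(data as CFData, nil) else { return }
        source = src
        // Decode and show only the first frame immediately.
        if let first = CGImageSourceCreateImageAtIndex(src, 0, nil) {
            image = UIImage(cgImage: first)
        }
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick() {
        guard let src = source else { return }
        let next = (frameIndex + 1) % CGImageSourceGetCount(src)
        decodeQueue.async { [weak self] in
            // Decode the upcoming frame off the main thread.
            guard let cg = CGImageSourceCreateImageAtIndex(src, next, nil) else { return }
            let frame = UIImage(cgImage: cg)
            DispatchQueue.main.async {
                self?.image = frame
                self?.frameIndex = next
            }
        }
    }

    func stop() { displayLink?.invalidate(); displayLink = nil }
}
```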
4.2 Replacing NSDictionary with NSCache for Frame Cache
Switching to NSCache allows the system to automatically evict frames based on cost, reducing CPU overhead caused by system‑initiated compression of NSDictionary.
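A frame cache built on NSCache might look like the sketch below; the key scheme and cost model are assumptions, not the component's actual code. Using the decoded byte size as the cost lets the system evict the most expensive frames first when the limit is hit.

```swift
import UIKit

/// Sketch: NSCache evicts frames automatically under memory pressure,
/// using each frame's approximate decoded byte size as its cost.
final class FrameCache {
    private let cache = NSCache<NSNumber, UIImage>()

    init(totalCostLimitBytes: Int) {
        cache.totalCostLimit = totalCostLimitBytes
    }

    func store(_ frame: UIImage, at index: Int) {
        guard let cg = frame.cgImage else { return }
        // Approximate decoded size: width * height * 4 bytes (BGRA).
        let cost = cg.width * cg.height * 4
        cache.setObject(frame, forKey: index as NSNumber, cost: cost)
    }

    func frame(at index: Int) -> UIImage? {
        cache.object(forKey: index as NSNumber)
    }
}
```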
4.3 Decoupling View and Frame Cache
Instead of the view holding the cache, QMAnimatedWebImage stores frames. When the image is removed from SDImageCache, the cache is cleared, preventing repeated decoding during cell reuse.
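The ownership change can be sketched like this (names are illustrative, not the actual API): because the decoded‑frame cache lives on the image object rather than the view, a reused cell that is handed the same image instance also gets its already‑decoded frames for free.

```swift
import UIKit

/// Sketch: frames are cached on the image wrapper, not the view.
/// Any view displaying the same AnimatedImage instance shares its frames,
/// so cell reuse does not trigger re-decoding.
final class AnimatedImage {
    let data: Data
    let frameCache = NSCache<NSNumber, UIImage>()
    init(data: Data) { self.data = data }

    /// Called when the image is evicted from the image cache (e.g. SDImageCache),
    /// releasing all decoded frames along with it.
    func purgeFrames() { frameCache.removeAllObjects() }
}

final class AnimatedView: UIImageView {
    // The view only references the image; it owns no frame storage of its own.
    var animatedImage: AnimatedImage?
}
```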
4.4 Downsampling
Large animated images are downsampled to the display size using the WWDC “Image and Graphics Best Practices” code:
func downsample(imageAt imageURL: URL, to pointSize: CGSize, scale: CGFloat) -> UIImage {
    // Don't cache the full-size decoded image
    let imageSourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    let imageSource = CGImageSourceCreateWithURL(imageURL as CFURL, imageSourceOptions)!
    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale
    let downsampleOptions = [kCGImageSourceCreateThumbnailFromImageAlways: true,
                             kCGImageSourceShouldCacheImmediately: true,
                             kCGImageSourceCreateThumbnailWithTransform: true,
                             kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels] as CFDictionary
    let downsampledImage = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, downsampleOptions)!
    return UIImage(cgImage: downsampledImage)
}

4.5 Memory‑limit Calculation and Cache Size Tuning
The app computes the process memory limit (on iOS 13+ via os_proc_available_memory() plus the current phys_footprint; earlier systems fall back to a fraction of physical memory) and sets SDImageCache.maxMemoryCost to 30% of that limit, which smooths memory usage and reduces OOM crashes.
// Example method to get the process memory limit (bytes)
// Requires: #import <mach/mach.h> and #import <os/proc.h>
- (int64_t)memoryUsageLimitByByte {
    int64_t memoryLimit = 0;
    if (@available(iOS 13.0, *)) {
        task_vm_info_data_t vmInfo;
        mach_msg_type_number_t count = TASK_VM_INFO_COUNT;
        kern_return_t kr = task_info(mach_task_self(), TASK_VM_INFO, (task_info_t)&vmInfo, &count);
        if (kr == KERN_SUCCESS) {
            int64_t memoryCanBeUse = (int64_t)os_proc_available_memory();
            if (memoryCanBeUse > 0) {
                // Limit = current footprint + remaining available headroom
                int64_t memoryUsed = (int64_t)vmInfo.phys_footprint;
                memoryLimit = memoryUsed + memoryCanBeUse;
            }
        }
    }
    if (memoryLimit <= 0) {
        // Pre-iOS 13 fallback: assume roughly 55% of physical memory is usable
        NSLog(@"Fallback to physical memory");
        memoryLimit = [NSProcessInfo processInfo].physicalMemory * 0.55;
    }
    return memoryLimit;
}

4.6 Skipping Decoding During Fast Scrolling
When the UI is scrolling quickly, the component can suspend frame‑decoding tasks, reducing CPU load.
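One way to wire this up is through the scroll view's delegate, suspending the decode queue while a drag is in progress and resuming it once scrolling settles. This is a sketch under assumed names, not the component's actual mechanism.

```swift
import UIKit

/// Sketch: suspend the frame-decoding queue while the user is scrolling,
/// and resume it once the scroll view comes to rest.
final class ScrollAwareDecoder: NSObject, UIScrollViewDelegate {
    private let decodeQueue = DispatchQueue(label: "frame.decode")
    private var suspended = false

    func scrollViewWillBeginDragging(_ scrollView: UIScrollView) {
        if !suspended { decodeQueue.suspend(); suspended = true }
    }

    func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
        if suspended { decodeQueue.resume(); suspended = false }
    }

    func scrollViewDidEndDragging(_ scrollView: UIScrollView, willDecelerate decelerate: Bool) {
        // If no deceleration follows the drag, resume immediately.
        if !decelerate, suspended { decodeQueue.resume(); suspended = false }
    }
}
```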
4.7 Unified Image Loading Component
The final component automatically detects whether a URL points to a static image or an animated one (GIF, WebP, APNG) and applies per‑frame loading when needed, providing a single API for both cases.
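Format detection of this kind is typically done by sniffing the file's magic bytes, as in the sketch below (an illustration, not the component's actual code). Note that signature sniffing only identifies the container: distinguishing APNG from static PNG (or animated from static WebP) still requires parsing chunks, such as PNG's acTL chunk.

```swift
import Foundation

/// Sketch: detect the container format from magic bytes so a single API can
/// route static images to plain UIImage and animated ones to per-frame loading.
enum ImageFormat { case gif, webp, png, jpeg, unknown }

func detectFormat(_ data: Data) -> ImageFormat {
    guard data.count >= 12 else { return .unknown }
    let bytes = [UInt8](data.prefix(12))
    switch (bytes[0], bytes[1], bytes[2]) {
    case (0x47, 0x49, 0x46): return .gif    // "GIF"
    case (0x89, 0x50, 0x4E): return .png    // PNG signature (APNG shares it)
    case (0xFF, 0xD8, 0xFF): return .jpeg
    case (0x52, 0x49, 0x46):                // "RIFF" container
        // Bytes 8-11 spell "WEBP" for WebP files.
        return bytes[8...11].elementsEqual([0x57, 0x45, 0x42, 0x50]) ? .webp : .unknown
    default: return .unknown
    }
}
```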
5. Comparison with Other Open‑Source Solutions
A stress test with ~200 animated images on an iPhone 7 Plus showed the following metrics:
| Metric | UIImageView | SDAnimatedImageView | YYAnimatedImageView | QMAnimatedImageView |
|---|---|---|---|---|
| Memory peak | 1.9 GB | 1.1 GB | 1.3 GB | 0.8 GB |
| Memory‑pressure events/min | 12 | 0.67 | 0 | 0 |
| Average FPS | 52 | 36 | 57 | 58.3 |
| Jank count/10 min | 38 | 107 | 23 | 0 |
| Severe jank count/10 min | 25 | 9.8 | 15 | 0 |
| Jank duration % | 12% | 1.9% | 0.9% | 0% |
| CPU load | 48% | 43% | 81% | 27% |
| Crashes | Crash within 5–40 s | No crash | Crash after 1–2 min | No crash |
Key takeaways:
Plain UIImageView cannot handle heavy animated‑image streams.
SDAnimatedImageView reduces CPU but still suffers from low FPS and noticeable jank.
YYAnimatedImageView has high memory and CPU usage, leading to crashes.
QMAnimatedImageView achieves the best balance: no jank, low CPU, controlled memory, and no crashes.
6. Summary of Optimizations
Adopt per‑frame loading to avoid decoding all frames up‑front.
Separate frame cache from the view to prevent repeated decoding during fast scrolling.
Limit SDImageCache memory usage to avoid OOM under high CPU load.
Replace NSDictionary with NSCache for frame caching.
Release cache proactively when decoding fails due to memory shortage.
Enable downsampling to match display size and save memory.
Pause decoding tasks while the UI is scrolling quickly.
Provide a unified image loading component that works for both static and animated images.
7. References
[1] iOS Memory Deep Dive – https://developer.apple.com/videos/play/wwdc2018/416/
[2] Image and Graphics Best Practices – https://developer.apple.com/videos/play/wwdc2018/219/
[3] Should Apps and Games Care About Jank and Jank Rate? (in Chinese) – https://bbs.perfdog.qq.com/article-detail.html?id=6
Tencent Music Tech Team
Public account of Tencent Music's development team, focusing on technology sharing and communication.