Build a Live Streaming Service with ZLMediaKit, FFmpeg, and Spring Boot
This guide walks you through installing ZLMediaKit, configuring FFmpeg, setting up a Spring Boot backend with process management, and creating a web player to stream and view video content using RTMP and HTTP-FLV protocols.
1. Environment preparation
1.1 ZLMediaKit
Pull and run the ZLMediaKit Docker image:
# Pull the image
docker pull zlmediakit/zlmediakit:master
# Run container
docker run -d \
--name zlm-server \
-p 1935:1935 \
-p 8099:80 \
-p 8554:554 \
-p 10000:10000 \
-p 10000:10000/udp \
-p 8000:8000/udp \
-v /docker-volumes/zlmediakit/conf/config.ini:/opt/media/conf/config.ini \
zlmediakit/zlmediakit:master
Key config.ini parameters for HLS:
[hls]
broadcastRecordTs=0 # 0 disables recording
deleteDelaySec=300 # Delete after 5 minutes
fileBufSize=65536
filePath=./www
segDur=2 # Segment duration (seconds)
segNum=1000 # Max number of .ts segments
segRetain=9999 # Segments retained on disk
1.2 FFmpeg
Download FFmpeg from https://ffmpeg.org/download.html and add the ffmpeg binary to the system PATH.
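Before wiring FFmpeg into the backend, it can help to confirm the binary is actually reachable from the JVM that will spawn it. A minimal stdlib-only sketch (the class and helper name are illustrative, not part of the project):

```java
import java.io.IOException;

public class FfmpegCheck {
    // Returns true if the given ffmpeg executable can be launched from this JVM.
    static boolean checkFfmpeg(String ffmpegPath) {
        try {
            Process p = new ProcessBuilder(ffmpegPath, "-version")
                    .redirectErrorStream(true)
                    .start();
            p.waitFor();
            return p.exitValue() == 0;
        } catch (IOException e) {
            // Binary not found on PATH, or not executable
            return false;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        // Prints true only when "ffmpeg" resolves on the system PATH
        System.out.println(checkFfmpeg("ffmpeg"));
    }
}
```

If this prints false, fix the PATH (or set an absolute path in the `stream.ffmpeg-path` property introduced below) before debugging anything else.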
2. Spring Boot backend
2.1 Maven dependency
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-exec</artifactId>
    <version>1.3</version>
</dependency>
2.2 Configuration properties
package com.lyk.plugflow.config;

import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

@Data
@Component
@ConfigurationProperties(prefix = "stream")
public class StreamConfig {
    private String zlmHost;    // ZLMediaKit address
    private Integer rtmpPort;  // RTMP port
    private Integer httpPort;  // HTTP-FLV port
    private String ffmpegPath; // Path to ffmpeg executable
    private String videoPath;  // Directory for uploaded videos
}
2.3 Stream service
The service builds an FFmpeg command that reads a local video file and pushes it to ZLMediaKit over RTMP. It manages the FFmpeg process with commons-exec, stores running executors in a concurrent map, and exposes methods to start and stop streams, build playback URLs, and check stream status.
package com.lyk.plugflow.service;

import com.lyk.plugflow.config.StreamConfig;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.exec.*;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

@Slf4j
@Service
public class StreamService {

    @Autowired
    private StreamConfig streamConfig;

    private final Map<String, DefaultExecutor> streamProcesses = new ConcurrentHashMap<>();
    private final Map<String, Boolean> manualStopFlags = new ConcurrentHashMap<>();

    public boolean startStream(String videoPath, String streamKey) {
        File videoFile = new File(videoPath);
        if (!videoFile.exists()) {
            log.error("Video file does not exist: {}", videoPath);
            return false;
        }
        String rtmpUrl = String.format("rtmp://%s:%d/live/%s",
                streamConfig.getZlmHost(), streamConfig.getRtmpPort(), streamKey);
        CommandLine cmdLine = getCommandLine(videoPath, rtmpUrl);

        DefaultExecutor executor = new DefaultExecutor();
        executor.setExitValue(0);
        ExecuteWatchdog watchdog = new ExecuteWatchdog(ExecuteWatchdog.INFINITE_TIMEOUT);
        executor.setWatchdog(watchdog);
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        executor.setStreamHandler(new PumpStreamHandler(output));

        // Register before executing, so the async callbacks can safely remove the entry.
        streamProcesses.put(streamKey, executor);
        try {
            executor.execute(cmdLine, new ExecuteResultHandler() {
                @Override
                public void onProcessComplete(int exitValue) {
                    log.info("Streaming completed, key: {}, exit: {}", streamKey, exitValue);
                    streamProcesses.remove(streamKey);
                }

                @Override
                public void onProcessFailed(ExecuteException e) {
                    // remove() returns null when the stop was not manual; avoid unboxing null.
                    boolean manual = Boolean.TRUE.equals(manualStopFlags.remove(streamKey));
                    if (manual) {
                        log.info("Streaming manually stopped, key: {}", streamKey);
                    } else {
                        log.error("Streaming failed, key: {}, error: {}", streamKey, e.getMessage());
                    }
                    streamProcesses.remove(streamKey);
                }
            });
        } catch (IOException e) {
            streamProcesses.remove(streamKey);
            log.error("Failed to launch FFmpeg, key: {}", streamKey, e);
            return false;
        }
        log.info("Started streaming, key: {}, rtmpUrl: {}", streamKey, rtmpUrl);
        return true;
    }

    private CommandLine getCommandLine(String videoPath, String rtmpUrl) {
        CommandLine cmd = new CommandLine(streamConfig.getFfmpegPath());
        cmd.addArgument("-re");           // Read input at its native frame rate
        cmd.addArgument("-i");
        cmd.addArgument(videoPath);
        cmd.addArgument("-c:v");
        cmd.addArgument("libx264");       // Encode video as H.264
        cmd.addArgument("-c:a");
        cmd.addArgument("aac");           // Encode audio as AAC
        cmd.addArgument("-f");
        cmd.addArgument("flv");           // RTMP requires the FLV container
        cmd.addArgument("-flvflags");
        cmd.addArgument("no_duration_filesize");
        cmd.addArgument(rtmpUrl);
        return cmd;
    }

    public boolean stopStream(String streamKey) {
        DefaultExecutor executor = streamProcesses.get(streamKey);
        if (executor != null) {
            manualStopFlags.put(streamKey, true);
            ExecuteWatchdog watchdog = executor.getWatchdog();
            if (watchdog != null) {
                watchdog.destroyProcess();
            }
            streamProcesses.remove(streamKey);
            log.info("Stopped streaming, key: {}", streamKey);
            return true;
        }
        return false;
    }

    public String getPlayUrl(String streamKey, String protocol) {
        return switch (protocol.toLowerCase()) {
            case "flv" -> String.format("http://%s:%d/live/%s.live.flv",
                    streamConfig.getZlmHost(), streamConfig.getHttpPort(), streamKey);
            case "hls" -> String.format("http://%s:%d/live/%s/hls.m3u8",
                    streamConfig.getZlmHost(), streamConfig.getHttpPort(), streamKey);
            default -> null;
        };
    }

    public boolean isStreaming(String streamKey) {
        return streamProcesses.containsKey(streamKey);
    }
}
2.4 Application configuration (application.yml)
stream:
  zlm-host: 192.168.159.129
  rtmp-port: 1935
  http-port: 8099
  ffmpeg-path: ffmpeg
  video-path: /videos/

spring:
  servlet:
    multipart:
      max-file-size: 1GB
      max-request-size: 1GB
3. Usage
3.1 Streaming workflow
Start the ZLMediaKit container.
Upload video files to the directory defined by stream.video-path.
Call the startStream API with the absolute video path and a unique stream key.
The service creates an FFmpeg command and pushes the stream to ZLMediaKit via RTMP.
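For reference, the push command assembled in step 4 can also be sketched with only the JDK, without commons-exec; the class and method names below are illustrative, and the paths are placeholders:

```java
import java.util.List;

public class FfmpegCommand {
    // Builds the same argument list the service constructs in getCommandLine().
    static List<String> build(String ffmpegPath, String videoPath, String rtmpUrl) {
        return List.of(
                ffmpegPath,
                "-re",                               // read input at native frame rate
                "-i", videoPath,
                "-c:v", "libx264",                   // H.264 video
                "-c:a", "aac",                       // AAC audio
                "-f", "flv",                         // FLV container for RTMP
                "-flvflags", "no_duration_filesize",
                rtmpUrl);
    }

    public static void main(String[] args) {
        List<String> cmd = build("ffmpeg", "/videos/sample.mp4",
                "rtmp://192.168.159.129:1935/live/stream");
        System.out.println(String.join(" ", cmd));
        // A plain-JDK launch would then be: new ProcessBuilder(cmd).start()
    }
}
```

Passing the arguments as a list (rather than one shell string) avoids quoting problems when video paths contain spaces, which is also why the service adds each argument individually.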
3.2 Playback workflow
Obtain a playback URL using getPlayUrl. Supported protocols are flv (HTTP‑FLV) and hls.
Use a browser player that supports FLV (e.g., flv.js) to view the live or recorded stream.
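Using the host and ports from application.yml, the two URL shapes returned by getPlayUrl work out as follows; the standalone class here simply mirrors the service method for illustration:

```java
public class PlayUrls {
    // Mirrors StreamService.getPlayUrl; ZLMediaKit serves HTTP-FLV at
    // /live/{key}.live.flv and HLS at /live/{key}/hls.m3u8.
    static String playUrl(String host, int port, String streamKey, String protocol) {
        return switch (protocol.toLowerCase()) {
            case "flv" -> String.format("http://%s:%d/live/%s.live.flv", host, port, streamKey);
            case "hls" -> String.format("http://%s:%d/live/%s/hls.m3u8", host, port, streamKey);
            default -> null; // unsupported protocol
        };
    }

    public static void main(String[] args) {
        System.out.println(playUrl("192.168.159.129", 8099, "stream", "flv"));
        // http://192.168.159.129:8099/live/stream.live.flv
        System.out.println(playUrl("192.168.159.129", 8099, "stream", "hls"));
        // http://192.168.159.129:8099/live/stream/hls.m3u8
    }
}
```

The FLV URL is what the flv.js player in section 3.4 consumes; the HLS URL can be handed to hls.js or to Safari directly.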
3.3 Manual FFmpeg test command
ffmpeg -re -i "sample.mp4" -c:v libx264 -preset ultrafast -tune zerolatency \
  -c:a aac -ar 44100 -b:a 128k -f flv rtmp://192.168.159.129:1935/live/stream
3.4 Minimal HTML player (flv.js)
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>FLV Live Player</title>
<script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/flv.min.js"></script>
</head>
<body>
<video id="videoElement" controls muted style="width:100%;height:450px;background:#000;">
Your browser does not support video playback
</video>
<script>
const video = document.getElementById('videoElement');
const url = 'http://192.168.159.129:8099/live/stream.live.flv';
if (flvjs.isSupported()) {
const player = flvjs.createPlayer({type: 'flv', url, isLive: true});
player.attachMediaElement(video);
player.load();
player.play();
} else {
console.error('FLV not supported');
}
</script>
</body>
</html>
This setup provides a complete end-to-end live-streaming solution: ZLMediaKit handles ingestion and distribution, FFmpeg performs format conversion, Spring Boot orchestrates the processes, and a lightweight HTML page with flv.js enables real-time playback in browsers.