Building a Real-Time Audio/Video Live Streaming Project with WebRTC and Swoole
This tutorial demonstrates how to combine Swoole's high‑performance PHP WebSocket server with WebRTC's browser‑based real‑time audio/video capabilities, covering server setup, client media capture, video segmentation with FFmpeg, and complete code examples to build a functional live‑streaming application.
With the continuous development of internet technologies, audio/video live streaming has become increasingly popular, leading developers to explore related solutions such as WebRTC and Swoole.
1. Setting up the Swoole server – Swoole is an open‑source high‑performance network communication framework for PHP. The following code creates a WebSocket server that listens on port 9501 and handles connection, message, and close events:
use Swoole\Http\Request;
use Swoole\WebSocket\Frame;
use Swoole\WebSocket\Server;

$server = new Server("0.0.0.0", 9501);

$server->on("open", function (Server $server, Request $request) {
    echo "client {$request->fd} connected\n";
});

$server->on("message", function (Server $server, Frame $frame) {
    echo "received message: {$frame->data}\n";
    $server->push($frame->fd, "hello");
});

$server->on("close", function (Server $server, int $fd) {
    echo "client {$fd} closed\n";
});

$server->start();

Running this script starts the WebSocket server, which will later be used to exchange signaling data for WebRTC.
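Before wiring up WebRTC, it helps to agree on a message format for the signaling channel. The sketch below defines a tiny JSON envelope for messages sent through the Swoole server; the `{event, data}` shape is an assumption of this example, not a WebRTC requirement, and any JSON structure both peers agree on will do.

```javascript
// Encode a signaling message into the JSON envelope the peers agree on.
// The {event, data} shape is this example's convention, not a standard.
function encodeSignal(event, data) {
  return JSON.stringify({ event: event, data: data });
}

// Decode and minimally validate an incoming signaling message.
function decodeSignal(raw) {
  const msg = JSON.parse(raw);
  if (typeof msg.event !== "string") {
    throw new Error("malformed signaling message");
  }
  return msg;
}
```

On the PHP side, the `message` handler would `json_decode($frame->data)` and push the envelope on to the other connected client instead of replying with a fixed string.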
2. Real‑time audio/video communication with WebRTC – WebRTC provides browser‑to‑browser media streaming. The example below requests access to the user's camera and microphone and attaches the stream to a video element:
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
    .then(function (stream) {
        var video = document.querySelector('video');
        video.srcObject = stream;
    })
    .catch(function (error) {
        console.error(error);
    });

Combined with the Swoole WebSocket server for signaling, this enables real-time peer-to-peer media exchange.
3. Video stream segmentation – To ensure smooth transmission, the media can be sliced into small segments using FFmpeg. The following command creates HLS fragments of five seconds each:
ffmpeg -i input.mp4 -c:v libx264 -c:a aac -f hls -hls_time 5 -hls_list_size 0 output.m3u8

These segments can be sent through the WebSocket connection as they become available, allowing continuous playback on the client side.
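To push each segment over the WebSocket as FFmpeg writes it, the server needs the segment file names, which FFmpeg records in the `output.m3u8` playlist. Below is a small illustrative helper (not part of FFmpeg or Swoole) that extracts them; in practice the PHP side would do the equivalent with `file()` and string functions.

```javascript
// Extract segment file names from an HLS playlist: every non-empty line
// that is not a #-prefixed tag is a media segment URI.
function parsePlaylist(m3u8Text) {
  return m3u8Text
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line && !line.startsWith("#"));
}
```

Polling the playlist for newly appended names tells the server which `.ts` files are ready to transmit.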
4. Complete project code – The article provides the full set of server‑side and client‑side snippets shown above, which together constitute a functional audio/video live‑streaming application built with WebRTC and Swoole.
By following these steps, developers can quickly prototype a live‑streaming service and gain practical experience with both backend (Swoole) and frontend (WebRTC) real‑time communication technologies.