Implementing the HTTP Live Streaming (HLS) Protocol in Video Surveillance Systems


When we developed Viinex middleware, our goal was to utilize web technologies and ensure full HTML5 compatibility. To achieve this, we added support for the HLS (HTTP Live Streaming) video streaming protocol at the end of 2016, along with an embedded web server for media data access and Viinex management via HTTP RESTful API. This provided our first version of Viinex with its minimum basic functionality: video reception, recording, and HLS streaming.

Much has been written about HLS since Apple introduced it in 2009, as well as about newer protocols. In 2016, we faced a difficult decision on which video streaming protocol to use for the first release: HLS, known for its robustness, or MPEG-DASH, which seemed easier to implement. We ultimately chose HLS, and looking back it was the right choice: it remains the most popular video streaming protocol according to our estimates and industry reports.

HLS offers several advantages, including compatibility with HTTP, ease of use for developers without the need for browser modification, and the ability to utilize common web services. The protocol is widely supported by platforms and browsers and can be played on almost any device, particularly those offered by Apple. It is reliable, time-tested, and easy to integrate into web services development.

Viinex uses the HLS protocol for both archived and live video streaming. To stream video, the video is divided into small segments, or chunks, usually lasting a few seconds; the chunk duration is adjustable on the Viinex server. When a user requests to watch the video, the HLS server sends a playlist file (in M3U8 format) that contains a list of URLs for each chunk in the video. The client (e.g. a web browser) downloads and plays each chunk in sequence, allowing for smooth playback of the video. The application in the user's browser, which is obtained from the web server, regularly refreshes the M3U8 list of video chunks, in accordance with the HLS specification, providing the user with a continuously updated video stream. The diagram below illustrates how Viinex's web server interacts with the client to deliver the video stream.
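For illustration, a live playlist served for a camera might look like the following (the segment file names, durations, and sequence numbers here are hypothetical, but the directives follow the HLS specification):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:117
#EXTINF:4.000,
chunk117.ts
#EXTINF:4.000,
chunk118.ts
#EXTINF:4.000,
chunk119.ts
```

On each refresh the server advances the media sequence number and drops the oldest chunk from the list while appending the newest one, which is how the "sliding window" of a live stream is expressed in a plain text file.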

The Viinex API allows customization of video streams to meet the user's needs, such as reducing the frame rate or resolution using the built-in renderer object included in Viinex, or switching a camera to another ONVIF profile. Temporary copies of the video stream can be created with the specified settings and deleted when no longer necessary, for example after the session with the user who requested the stream has ended.

The HLS protocol recommends that the media player buffer at least three video chunks before starting playback, resulting in a certain amount of latency during live video streaming. The length of each video chunk affects the latency, which can cause significant time delays in real-time video surveillance systems, such as when controlling a pan/tilt camera.
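This startup latency is easy to estimate: with the recommended buffer of three chunks, the minimum delay is roughly the chunk duration multiplied by three. A back-of-the-envelope sketch (the function name and the 6-second default are illustrative, not part of the Viinex API):

```javascript
// Rough startup-latency estimate for live HLS playback.
// Players typically buffer several chunks (three, per the HLS
// recommendation) before starting, so the minimum live latency
// is approximately bufferedChunks * chunkDurationSeconds.
function estimateHlsLatency(chunkDurationSeconds, bufferedChunks = 3) {
  return bufferedChunks * chunkDurationSeconds;
}

// With a common 6-second chunk duration:
console.log(estimateHlsLatency(6)); // prints 18 (seconds before playback starts)

// With 2-second chunks:
console.log(estimateHlsLatency(2)); // prints 6
```

This is why chunk duration, not network speed, usually dominates live HLS latency.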

Minimizing the latency of HLS video streaming can be achieved by shortening the video chunks; however, this approach carries the risk of losing video fragments. The length of each chunk determines the time available to download the subsequent chunk, and the more time there is, the greater the chance that the chunk will be downloaded successfully. If it is not, the affected chunk will not be displayed to the viewer. In networks with unstable bandwidth, the probability rises that downloading the next chunk takes longer than playing the chunks already buffered, which causes stalls or skipped fragments.

At Viinex, we successfully reduced latency to 2-3 seconds by experimenting with shorter chunk durations and playlist lengths. However, we don't recommend these test settings; it's better to use Apple's defaults. Additionally, the video source's encoder constrains chunk duration through its GOP (group of pictures) structure: each chunk must begin with a key frame. If key frames appear only once every 30 seconds, as with Axis cameras in Zipstream mode, then a chunk will be at least 30 seconds long. This means that reducing HLS latency is only possible with appropriate camera settings.
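The constraint that a chunk must start on a key frame can be sketched as follows (a simplified model: real packagers align chunk boundaries to key frames, so the effective chunk duration is the requested duration rounded up to a whole number of key-frame intervals; the function name is illustrative):

```javascript
// A chunk must begin with a key frame, so the shortest chunk a packager
// can actually produce is the requested duration rounded up to a whole
// number of key-frame (GOP) intervals emitted by the camera's encoder.
function effectiveChunkDuration(requestedSeconds, keyframeIntervalSeconds) {
  return Math.ceil(requestedSeconds / keyframeIntervalSeconds)
         * keyframeIntervalSeconds;
}

// With 30 s between key frames (e.g. aggressive bitrate-saving modes),
// asking for 2-second chunks still yields 30-second chunks:
console.log(effectiveChunkDuration(2, 30)); // prints 30

// With a 1-second GOP, short chunks are achievable:
console.log(effectiveChunkDuration(4, 1)); // prints 4
```

In practice this means the key-frame interval on the camera must be set no longer than the desired chunk duration before any latency tuning on the streaming side can help.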

HLS was initially developed to solve the problem of smooth playback through video buffering in the browser, making it well-suited for recorded video, such as movies or video clips. However, for real-time video streaming with minimal latency and smooth playback, a different standard, WebRTC, is more appropriate. Viinex offers this technology as well, but that’s a different discussion altogether.

HLS is still being developed: at WWDC 2019, Apple announced an extension of the protocol (a preliminary specification for Low-Latency HLS) that allows media data to be transmitted with low latency.

The integration of HLS with Viinex is straightforward, providing compatibility with client applications for both live video and video archive streaming, including functions like search, playback, pause, and stop. To utilize HLS with Viinex, all that is needed is to connect a camera (e.g. Cam 1) and create a basic HTML file in accordance with HTML5 specifications. For instance, a simple HTML file that displays the live video stream from Cam 1 could look like this:

<!DOCTYPE html>
<html>
<head>
<title>A simple example of HLS streaming based on Viinex Video Management SDK</title>
</head>
<body>
<video id="livevideo" controls>
  <!-- IP address, port and path to the video stream -->
  <source src="http://127.0.0.1:8880/v1/svc/cam1/stream.m3u8" />
</video>
<script src="hls.min.js"></script>
<script>
var isAndroid = /(android)/i.test(navigator.userAgent);

function startLive(videoElementId, cameraName)
{
  var video = document.getElementById(videoElementId);
  var hls = new Hls();
  hls.loadSource('http://127.0.0.1:8880/v1/svc/' + cameraName + '/stream.m3u8');
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, function() {
    if(!isAndroid){
      video.controls = false;
    }
    video.play();
  });
}

if(Hls.isSupported()) {
  startLive("livevideo", "cam1");
  document.write("Playback via hls.js<br/>");
}
else {
  document.write("Native iOS HLS playback");
}
</script>
</body>
</html>

In Safari this page works even without the JavaScript, since Safari plays HLS natively through the video element; all other browsers need the short hls.js snippet above for HLS playback.
