EvoStream Media Server Latency Overview
The latency of a video is the delay between a real-world event and its visible playback. For example, suppose you are watching yourself on a security camera feed and you wave your hand. The latency of the video is how long it takes for that wave to appear on screen.
Many factors contribute to the latency of a video:
- The time it takes to create the video in the first place and encode it
- The time it takes the video to be put onto the network (TCP and IP network buffers)
- The time it takes the video to be read, translated, and put back on the network for consumption (the EvoStream Media Server's role)
- The time it takes to receive the video, potentially buffer it, decode it, and display it on the screen (the player)
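Because these stages are sequential, the end-to-end ("glass-to-glass") latency is roughly the sum of the per-stage delays. The sketch below illustrates this; the stage names and millisecond values are hypothetical examples, not EvoStream measurements.

```python
# Hypothetical per-stage delays, mirroring the components listed above.
STAGE_LATENCIES_MS = {
    "capture_and_encode": 120,   # camera capture plus encoder delay
    "network_ingest": 40,        # TCP/IP buffering on the way to the server
    "server_relay": 30,          # read, translate, re-publish (media server)
    "player": 250,               # receive, buffer, decode, display
}

def total_latency_ms(stages: dict) -> int:
    """Approximate glass-to-glass latency: the sum of all stage delays."""
    return sum(stages.values())

if __name__ == "__main__":
    print("Approximate end-to-end latency:",
          total_latency_ms(STAGE_LATENCIES_MS), "ms")
```

In practice the player's buffer usually dominates this sum, which is why player configuration matters so much in low-latency designs.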
Each of these factors can be tuned to increase or decrease latency. Influences on each factor include:
- The amount of loss on the network
- The frame-rate of the video
- The configured receive or send buffer
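The buffer item above has a simple, direct effect on latency: a buffer of N frames holds back roughly N / frame-rate seconds of video. The helper below is an illustrative sketch of that relationship, not an EvoStream API.

```python
def buffer_latency_s(buffered_frames: int, fps: float) -> float:
    """Latency in seconds contributed by a buffer holding the given
    number of frames at the given frame rate."""
    return buffered_frames / fps

# For example, a 60-frame buffer at 30 fps holds back about 2 seconds
# of video, while the same buffer at 60 fps holds back only 1 second.
```

This is also why a higher frame rate can reduce latency: the same frame-count buffer drains faster.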
EvoStream has run extensive tests to identify the latency not only of the EvoStream Media Server but also of common video sources and players. Understanding the latency of all of these components is critical to the design of any low-latency live-streaming platform.
Through testing various combinations of stream sources and players, EvoStream has identified the typical (average) latencies of these tools. For all players and stream sources, EvoStream used the default settings and configurations. This reflects the most common user experience while also reducing the number of variables in these latency tests. The only exception is Flowplayer, where the inbound buffer was set to zero (0).