
Main testing challenges

While evolving in complexity and sophistication to deliver better quality, OTT video streaming applications come with additional layers of encryption in their delivery protocols (e.g., QUIC) as well as with non-standardized, proprietary video codecs and clients. In addition, OTT applications change continuously and dynamically, both to improve the user experience and to protect the video content from piracy with sophisticated encryption schemes. This closed, non-transparent design makes it difficult to develop and deploy testing solutions, which require access to the video stream to determine user-perceived quality metrics. The level of difficulty also depends on the device's operating system (OS).

5G networks have created a favorable ecosystem for CDNs to offer a multitude of OTT video streaming applications. The resulting variety and diversity bring different protocols, platforms, device operating systems, and non-standardized proprietary codecs and clients; consequently, each application has a different expected performance that needs to be tested. Driven by the desire to continuously increase user-perceived quality, each OTT application also runs continuous software version updates. All of this adds further complexity to testing.

To cope with these challenges, the ETSI STQ-Mobile group developed and released TR 101.578 and TR 103.488, which offer guidance for testing OTT video streaming applications. The defined KPIs and their measurement cover OTT video streaming accessibility and retainability as well as the video presentation quality during playback (see Table 1). For the latter, however, ETSI recommends describing quality through several KPIs (Table 1) rather than a single QoE/MOS score. A sketch of how such accessibility and retainability KPIs can be derived from session events follows below.
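The following minimal Python sketch illustrates the idea behind accessibility- and retainability-style KPIs of this kind. The event names (request time, first picture, played duration) and the KPI formulas are illustrative assumptions for this sketch, not definitions taken from the ETSI documents.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StreamingSession:
    """Event timestamps (in seconds) collected for one OTT streaming session.
    Field names are illustrative, not taken from ETSI TR 101.578/TR 103.488."""
    request_time: float                   # user taps "play"
    first_picture_time: Optional[float]   # first frame rendered; None = access failure
    intended_duration: float              # how long the clip was meant to play
    played_duration: float                # how long it actually played

def video_access_time(s: StreamingSession) -> Optional[float]:
    """Accessibility-style KPI: perceived waiting time from request to first picture."""
    if s.first_picture_time is None:
        return None  # access failure; counted by access_failure_ratio instead
    return s.first_picture_time - s.request_time

def access_failure_ratio(sessions: List[StreamingSession]) -> float:
    """Share of sessions in which no video picture ever appeared."""
    failures = sum(1 for s in sessions if s.first_picture_time is None)
    return failures / len(sessions)

def cutoff_ratio(sessions: List[StreamingSession]) -> float:
    """Retainability-style KPI: share of successfully started sessions
    that ended before their intended duration."""
    started = [s for s in sessions if s.first_picture_time is not None]
    cut = sum(1 for s in started if s.played_duration < s.intended_duration)
    return cut / len(started)
```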

ITU-T Study Group 12 has also undertaken extensive research to develop a series of models designed to estimate a user's subjective opinion (QoE/MOS) of video streaming playback quality². All these solutions, although accurate, have serious drawbacks when it comes to implementation in a drive testing and/or on-device testing solution. Firstly, most of them require information elements embedded in the video bit stream as input parameters (e.g., knowledge of I- and P-frames), and the bit stream is generally heavily encrypted. Secondly, even those solutions that rely only on transport parameters, which are easier to obtain, are generally trained to work for a single application and a limited set of resolutions. Thirdly, the provided video QoE scores correspond to a minimum measurement granularity of 4-6 seconds for continuous scoring and about 60 seconds or more for per-session scoring. The 4-6 second granularity, although not optimal, can reflect network behavior in most drive test scenarios. However, the 60-second per-session scoring is less meaningful for drive testing, since network problems can be hidden and/or smoothed out, as the example below illustrates. Therefore, while ETSI provides exact guidance for determining the perceived waiting time and perceived retainability of video streaming sessions, perceived video presentation (playback) quality remains largely open: neither the ETSI nor the ITU-T SG12 solutions are optimal for drive testing scenarios or for the variety of OTT video streaming applications.
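To make the granularity argument concrete, the following minimal Python sketch (with made-up score values, not outputs of any ITU-T model) contrasts 4-second windowed scoring with a single per-session average over a 60-second session containing a sharp quality dip.

```python
from statistics import mean
from typing import List

def windowed_scores(per_second_mos: List[float], window: int = 4) -> List[float]:
    """Aggregate per-second quality scores into fixed windows (here 4 s),
    mimicking the continuous-scoring granularity discussed above."""
    return [mean(per_second_mos[i:i + window])
            for i in range(0, len(per_second_mos), window)]

# Hypothetical 60 s session: good quality except a sharp 8 s degradation mid-drive.
per_second = [4.5] * 26 + [1.5] * 8 + [4.5] * 26

print(windowed_scores(per_second))  # the dip is clearly visible (windows drop to 1.5)
print(mean(per_second))             # 4.1 for the whole session: the problem is smoothed out
```

A session score of 4.1 would suggest near-excellent quality, even though the user experienced eight seconds of severe degradation, which is exactly the kind of localized network problem a drive test is meant to surface.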

2. ITU-T P.1203.x and ITU-T P.1204.x; x = 1-5
