{"id":390,"date":"2017-01-25T05:28:52","date_gmt":"2017-01-25T05:28:52","guid":{"rendered":"http:\/\/measuringu.com\/entertainment-benchmarks\/"},"modified":"2021-08-10T08:23:57","modified_gmt":"2021-08-10T14:23:57","slug":"entertainment-benchmarks","status":"publish","type":"post","link":"https:\/\/measuringu.com\/entertainment-benchmarks\/","title":{"rendered":"UX & Net Promoter Benchmarks for Entertainment Websites"},"content":{"rendered":"
<p>Content is king, whether it\u2019s books, movies, audiobooks, news sites, or entertainment websites. When you have good content, people will come and stay. But if people can\u2019t find the content, or there\u2019s too much friction in the experience, you\u2019ll likely lose your audience even with killer content.<\/p>\n
<p>An increasing number of consumers now subscribe to a premium content service like Netflix, Hulu, or HBO GO (often called over-the-top, or OTT, services). While content is delivered via a TV, mobile device, or website, one of the first touchpoints for these services is often a website.<\/p>\n
<p>To understand the quality of the online experience, we examined the websites of five of the more popular entertainment services and collected benchmark metrics to quantify the experience.<\/p>\n
<h2>Benchmarking the Website Experience<\/h2>\n
<p>A good benchmark tells you where a website falls relative to the competition and is an essential step in understanding how any design change contributes to a quantifiable improvement. A website UX benchmark takes a tiered approach, as shown in Figure 1.<\/p>\n
<p><strong>Figure 1<\/strong>: Hierarchy and relationship of data collected in UX benchmark studies.<\/p>\n
<p>Figure 1 shows two things. First, overall website attitudes are affected by task metrics, which in turn are affected by interactions on the website. Second, website attitudes are also affected by variables outside the task experience (brand, prior experiences, etc.). All three levels are important to collect.<\/p>\n
<h2>The Study<\/h2>\n
<p>We had 227 participants attempt a task on one of five entertainment websites: Vudu, HBO GO, Amazon Instant Video, YouTube, and Netflix. Participants were asked to find an action movie, rated PG-13, with at least a 4-star rating. The task was open-ended enough to allow for self-directed exploration but had specific criteria that led participants to interact with filters, search features, and recommendation capabilities.<\/p>\n
<p>Participants completed tasks using MU-IQ, our platform for administering task-based survey questions. MU-IQ records participants\u2019 screens and collects detailed data on which pages participants visited and how long they spent on them. We reviewed a subset of the videos to examine the root causes behind the metrics.<\/p>\n
<p>More details on the study are available in the report; here are the highlights of what we found.<\/p>\n
<h3>Overall User Experience: SUPR-Q<\/h3>\n
<p>The SUPR-Q is a standardized measure of the quality of a website\u2019s user experience and a good way to gauge website attitudes (the outer layer in Figure 1). It\u2019s based on a database of 200 websites, so scores are percentile ranks that tell you how a website\u2019s experience ranks relative to the others. The SUPR-Q provides an overall score as well as detailed scores for the sub-dimensions of usability, trust, appearance, and loyalty.<\/p>\n
<p>In general, the entertainment websites in this study scored high across all dimensions. Netflix and YouTube led the pack, both scoring above the 95<sup>th<\/sup> percentile. HBO GO scored the lowest of the group, with a SUPR-Q score of 71%.<\/p>\n
<p>This shows that Netflix has regained its high place in our database since its temporary fall in 2011 after splitting the business and website.<\/p>\n
<h3>Net Promoter Score<\/h3>\n
<p>The Net Promoter Score (NPS) continues to be a popular measure of loyalty. YouTube rose above the rest with an NPS of 65%, followed by Netflix at 37%. HBO GO and Vudu had NPS scores less than half those of the leaders.<\/p>\n
<p>The SUPR-Q and Net Promoter Score describe the overall experience; they aren\u2019t meant to be diagnostic. To understand what\u2019s driving the higher experience scores, we needed to examine the more detailed task metrics and the behaviors we observed (the lower levels in Figure 1).<\/p>\n
<h3>Task Details<\/h3>\n
<h3>Task Ease and Success<\/h3>\n
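<p>As background on the NPS figures reported above: the Net Promoter Score is the percentage of promoters (9\u201310 on the 0\u201310 likelihood-to-recommend item) minus the percentage of detractors (0\u20136). A minimal sketch of the calculation (the function name and sample ratings are illustrative, not data from this study):<\/p>\n

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical likelihood-to-recommend responses (0-10 scale).
sample = [10, 9, 9, 8, 7, 10, 6, 9, 3, 10]
print(nps(sample))  # 6 promoters, 2 detractors out of 10 -> 40.0
```

<p>Note that passives (7\u20138) count toward the total number of respondents but not toward either group, which is why an NPS can sit well below the raw percentage of satisfied users.<\/p>\n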