Most of the millions of hotel bookings made each year are done online. Despite the proliferation of hotel aggregator websites like Expedia and Trivago, most people book directly on the hotel websites.
With such a high concentration of travelers booking directly on hotel websites, having a good user experience is a differentiator. If travelers can’t find needed information about a hotel or room, can’t make a reservation easily, or don’t understand fees, they might call the hotel reservation line (a cost), go elsewhere to book (a loss), and even tell friends and colleagues about the poor experience (a detractor).
It’s become increasingly important for a hotel to benchmark the user experience of its website and make changes based on the findings. To help provide context, we provide regular industry benchmarks as part of the SUPR-Q.
Benchmarking the Hotel Website Experience
To understand the quality of the online experience, we collected UX benchmark metrics on five popular hotel websites:
- Best Western
- Hilton
- Holiday Inn
- Hyatt
- Marriott
A good benchmark indicates where a website falls relative to the competition and is an essential step to understanding how any design changes contribute to a quantifiable improvement, which ultimately leads to an increase in website revenue.
A website UX benchmark consists of a tiered approach, as shown in Figure 1.
Figure 1 shows two things. First, overall website attitudes are affected by task metrics, which in turn are affected by interactions on the website. Second, website attitudes are affected by other variables outside the task experience (brand, prior experiences, etc.). All three levels are important to collect. This benchmarking study collects data on the outer and inner rings to provide both a good comparison set of attitudinal metrics and insights on what to improve from task metrics and website interactions.
We conducted two studies. First, we had 405 participants who had recently visited or booked reservations on one of the hotel websites reflect on their most recent experiences. Second, another 160 participants who had recently made a reservation on any hotel website were asked to complete two tasks on one of four of the websites (Best Western, Hilton, Hyatt, or Marriott).
The data was collected in June 2017. Participants in the studies answered the 8-item SUPR-Q (including the Net Promoter Score) and rated the functionality that contributes to a successful website experience, including selecting dates, choosing room options, and understanding the total costs of a stay (including fees). More details on the study are available in the full report; here are the highlights of what we found.
Reasons For Visiting
Most participants in the study visited the hotel websites they rated at least a few times per year, showing that customer loyalty plays an important role in the success of these brands. On average, participants spent between $250 and $350 during their last hotel stay.
Not surprisingly, participants in the study reported mostly booking (31% of the time) or browsing rooms (28% of the time) on their most recent visit to the websites. Around 4% of respondents reported visiting hotel websites while staying at the hotel itself. As we begin to benchmark the mobile user experience, we anticipate more people will access hotel apps and websites, which are beginning to offer more digital services.
Quality of the User Experience: SUPR-Q
The SUPR-Q is a standardized measure of the quality of a website’s user experience and is a good way to gauge website attitudes (the outer layer in Figure 1). It’s based on a rolling database of around 150 websites. Scores are percentile ranks and tell you how a website experience ranks relative to the other websites. The SUPR-Q provides an overall score as well as detailed scores for subdimensions of trust, usability, appearance, and loyalty.
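To make the percentile-rank idea concrete, here is a minimal sketch of how a raw questionnaire score can be converted to a percentile rank against a database of scores. The scores below are hypothetical stand-ins, not the actual SUPR-Q database.

```python
def percentile_rank(score, database):
    """Percentage of database scores that fall below the given score."""
    below = sum(1 for s in database if s < score)
    return round(100 * below / len(database))

# Hypothetical database of overall scores for other websites
database = [3.2, 3.6, 3.8, 4.0, 4.1, 4.3, 4.5, 4.6, 4.8, 4.9]

print(percentile_rank(4.4, database))  # → 60 (better than 60% of the database)
```

A score at the 76th percentile, for example, simply means the website scored better than 76% of the websites in the comparison database.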
In general, the hotel websites in this study scored above average across all dimensions with the industry average SUPR-Q at the 76th percentile (scoring better than 76% of the websites in the database). Best Western had the lowest SUPR-Q score of the group with a score at the 65th percentile and Marriott led the group with a score at the 85th percentile.
Best Western had the highest usability score of the group (at the 74th percentile), while Hilton had the lowest (at the 63rd percentile). The usability factor on the SUPR-Q predicts a SUS score; in the case of Best Western, it’s a SUS equivalent score of 77. We can dig into a bit of the “why” behind these usability scores by examining the behavioral data that MUIQ collects and the verbatim comments.
An examination of users’ behavior on the Best Western website (for the usability portion of this study) reveals there’s still room for improvement despite the above-average usability score from recent users of the website. The main call to action, the Book button, is below the fold (offscreen, where a user needs to scroll down to find it), which gave some participants trouble (Figure 2). Instead, users clicked the Update button, expecting to proceed with the reservation. This contributed to participants in the usability portion of the benchmark study giving it the second-lowest usability scores of the group.
Hilton’s website (with the lowest usability score of the group for recent users of the website) breaks one of the older conventions in website navigation—the logo leading back to the home page. In Hilton’s case, when users clicked the promotion on the hero image (where it says “Make it a 3 Day Weekend”), users were taken to its loyalty program website with no obvious way to get back to the reservation page.
This likely led to the perception of difficult navigation, which is reflected in the comment of one participant in the usability study:
“I found it difficult to navigate. I want to know where to go quickly to get the information I’m seeking rather than guess through a bunch of tabs.”
Participants also struggled making reservations on the Marriott home page. One participant mentioned, “I had a hard time selecting the dates that I wanted to search for.” In examining the click path data in MUIQ, we noticed a higher number of participants clicked non-clickable (NC) elements of the home page and site. The blue NC markers in Figure 4 show the average number of clicks on non-clickable elements (7 and 13 on some pages); the yellow-highlighted areas show when search was used in the process of finding the hotel.
Loyalty/Net Promoter Scores
The Net Promoter Score (NPS) continues to be a popular measure of loyalty. For this group, the average NPS was 13%. Marriott had the highest NPS of 25% and Best Western had the lowest NPS of -3% (slightly more detractors than promoters). Loyalty is highly affected by prior experience; but in this sample, participants actually had slightly less experience with Marriott, suggesting a strong positive effect from brand might be lifting the NPS (more details on the sample are available in the report).
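The NPS figures above follow the standard calculation: the percentage of promoters (ratings of 9 or 10 on the 0–10 likelihood-to-recommend item) minus the percentage of detractors (ratings of 0 through 6). A minimal sketch, using made-up ratings rather than our study data:

```python
def net_promoter_score(ratings):
    """NPS from 0-10 likelihood-to-recommend ratings:
    % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical sample with slightly more detractors than promoters,
# which produces a negative NPS (as with Best Western's -3%)
ratings = [10, 9, 8, 8, 7, 6, 5, 9, 3, 6]
print(net_promoter_score(ratings))  # → -10
```

Passive responders (7 or 8) count toward the total but neither add to nor subtract from the score, which is why an NPS can be low even when few respondents are outright detractors.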
Key Drivers of UX Quality (SUPR-Q)
The SUPR-Q provides a broader measure of the overall website user experience (loosely referred to as UX quality). The SUPR-Q, like most standardized measures, is not meant to be diagnostic. To understand what elements in the experience are likely affecting SUPR-Q scores, we conducted a key driver analysis on more specific “components” of the hotel website experience.
Participants responded to 10 “component” items that asked about shopping and purchasing using a 5-point Likert scale, and one brand-attitude item on a 7-point scale. Of the 11 items, 7 are key drivers (denoted with an * below) of the SUPR-Q and explain 66% of the variability in scores. How each website scored on each component is also available in the report.
- Everything included in the hotel’s rates was clearly communicated.*
- Rates were accurately and fully communicated.*
- The hotel’s rates are a good value.*
- The booking calendar was easy to use and understand.*
- It was easy to compare different reservation options.*
- Room options were available and clearly described.*
- Brand favorability.*
- It was easy to enter my billing information.
- I am confident that my reservation was accurate.
- The website clearly confirmed my reservation total and contents.
- I knew where I was in the booking process the entire time.
The graph in Figure 5 shows the percentage of the variation in SUPR-Q scores accounted for by each key driver.
For example, the booking calendar being easy to use and understand explains 10% of the variation in SUPR-Q scores, twice as much as the perception that rates are a good value, which explains 5%. It’s interesting that this basic element (the calendar) continues to be such an important part of a participant’s attitude toward a website experience.
The 34% “unexplained” proportion of variance reflects aspects that are unaccounted for with these 7 key drivers, and the inevitable variability and measurement error from collecting attitudinal data. However, explaining 66% of the variation in an attitude with just 7 items is excellent for the behavioral sciences.
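The “variance explained” figures above come from regression: a key driver analysis regresses the overall score on the component items and reads off how much of the variation each driver accounts for. As a simplified, single-predictor sketch of that idea (our actual analysis used multiple regression across all items; the data below are hypothetical):

```python
def r_squared(x, y):
    """R-squared for a single predictor: the squared correlation,
    i.e., the share of variance in y explained by x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical ratings: a 5-point component item (calendar ease)
# and the corresponding overall scores for eight participants
calendar_ease = [5, 4, 3, 5, 2, 4, 1, 3]
overall_score = [4.6, 4.1, 3.2, 4.8, 2.9, 4.0, 2.1, 3.5]

print(f"{r_squared(calendar_ease, overall_score):.0%} of variance explained")
```

With correlated predictors, as survey items always are, the per-driver shares come from partitioning the multiple-regression R² rather than from independent single-predictor fits, which is why the seven shares in Figure 5 sum to the overall 66%.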
Key Drivers by Website
We also conducted a key driver analysis for each individual hotel website to identify the strengths and weaknesses of each. The full drivers by website are available in the report. Some of the findings were:
Participants’ attitudes toward Hilton’s rates were a net drag on its SUPR-Q score. Both the clarity of rate communication and the perceived value of the rates were rated below average. For example, two participants said:
“Difficult to see how changing some aspects of a reservation would affect price.”
“Problem being able to compare room rates.”
Marriott has a strong brand reputation, and that has a net positive lift on its SUPR-Q scores. The brand falls at the 70th percentile—the highest in the group. In contrast, Best Western’s brand favorability was also a key driver but had a more muted effect, scoring at the 54th percentile (an average brand favorability score).