The User Experience of Airline & Aggregator Websites

Jeff Sauro, PhD

Most of us take to the skies for business or a vacation (some more frequently than others).

And who hasn’t had a nightmare travel experience with delays, cancellations or getting bumped?

While there is much written about the airline travel experience, a lot of time is also spent researching and purchasing those airline tickets online. We wanted to get a sense of what’s working and what’s not across the major US carriers’ websites and the popular aggregator websites.

In May of 2014, we asked 1,023 users (57% female) to reflect on their most recent online experience with one of 9 US-based airline or travel aggregator websites. In conjunction with our panel partner, Op4G, we had participants answer questions about researching or purchasing airfare and investigated factors including usability, loyalty, trust, and the main problems travelers encountered. We limited our research to researching or purchasing airfare, not hotel or rental car purchases (which aggregator websites support more prominently than airlines do).

To complement our analysis of past experiences, we also recruited another 130 participants to attempt to make a reservation on the three largest US carriers' websites (Southwest, United and American). We integrated these more qualitative findings into the larger retrospective analysis. We are reviewing the results and methods in a webinar on June 18, 2014 at 11 a.m. Pacific.

Here are some highlights of the results.

The US-based airline and aggregator websites include:

  • American
  • United
  • Delta
  • Southwest
  • Travelocity
  • Kayak
  • Priceline
  • Orbitz
  • Expedia

Each website had at least 100 responses; more details on the study are available in our full report.

Visit Frequency

In general, participants visit aggregators more frequently than they do airline websites. Many participants reported visiting the sites a few times a month or more, with aggregators getting more frequent visits (50% visiting at least monthly) than airlines (41% at least monthly). Orbitz had the highest visit frequency, with 58% of respondents reporting visiting once a month or more. United had the least frequent visits, with 38% reporting visiting once a month or more.

What People Do On Airline & Aggregator Websites

Most of the respondents (67%) reported using the websites for personal travel a few times per year, compared to 28% who reported using them for business travel a few times per year.

As with our examination of retail websites, the tasks users were performing weren't too surprising. In general, most users were looking up prices (56%) or purchasing a ticket (29%). Participants are about 27% more likely to research ticket prices on aggregators than on airline websites (56% vs. 44%, respectively). This suggests a pattern of using aggregators to price tickets and then completing the purchase on the airline's website to avoid fees, select premium seats, or ensure frequent flyer credit.

American Airlines had the highest share of users checking their frequent flyer miles (12%, compared to about 5% on the other airline sites).

Task performance was generally high across all websites (success rates above 96% for most tasks). Delta and American had slightly (though not statistically significantly) lower success rates for purchasing tickets (91% and 92%). Kayak users reported the lowest success rate for finding ticket prices, at 91%.

Overall Quality of the User Experience

To assess the overall quality of the online user experience, we used the Standardized User Experience Percentile Rank Questionnaire (SUPR-Q). The questionnaire consists of 8 items and is normed against a database of 200 websites. Scores are expressed as percentile ranks, so a score of 50% means the website has an average score (half of websites score higher and half score lower); higher scores indicate a higher quality experience. The SUPR-Q has an overall score and four factor scores: usability, appearance, trust, and loyalty.
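As a quick illustration of percentile-rank scoring, here is a minimal sketch with made-up numbers (not the SUPR-Q's actual norming tables): a raw score is simply located within the distribution of scores in a normative database.

```python
# Illustration only: how a raw questionnaire score maps to a percentile rank
# against a normative database. The distribution below is made up, not the
# actual SUPR-Q norms.
import numpy as np

def percentile_rank(raw_score: float, norm_scores: np.ndarray) -> float:
    """Percent of websites in the normative database scoring below raw_score."""
    return 100 * np.mean(norm_scores < raw_score)

rng = np.random.default_rng(0)
norm_db = rng.normal(loc=3.9, scale=0.4, size=200)  # hypothetical 200-site database
print(round(percentile_rank(4.3, norm_db)))          # roughly the 84th percentile
```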

As a group, the airline websites' average score is at the 83rd percentile. United was the only website that was statistically lower than the industry average, at the 65th percentile. There was no statistical difference between aggregators and airline websites.

Southwest led the group with an overall score of 91%, largely driven by higher loyalty and trust scores. Orbitz had a comparable score of 90% driven by usability and loyalty scores. While these scores reflect the website experience, they do loosely follow the rankings of other airline customer experience benchmarks.

To corroborate these findings, some participants in the separate task-based study were asked to make a reservation on both Southwest and United (in randomized order) and were then asked which experience they preferred. Participants preferred Southwest over United by a 2:1 ratio (p = .02), suggesting the numbers in the larger survey were not simply a result of some users having more experience with Southwest than with United.

Participants also had a slightly harder time completing the tasks on American's website: both task-ease and task-confidence ratings were statistically lower than on Southwest and United. When we observed users attempting the task, several had problems selecting their return trips because of a complicated and overbearing seat-upselling process (shown in the screenshot below).


Figure 1: Users complained and had trouble with the seat upgrade/upsell screen when trying to select their routes on American’s website.

This is also a problem we found with the larger sample.

American: “I hate being asked more than once about whether I want to purchase upgraded seats. To print a boarding pass you have to say ‘no’ three times.”

Reflections on Interactions

We asked participants to reflect on their interactions with key functionality across the sites, including the calendar, checking in for a flight, choosing a seat, making a reservation, redeeming miles/points, finding contact information, and changing a reservation.

Participants reported higher satisfaction with the interactions on the aggregators compared to the airline websites. Southwest had the highest satisfaction among the airline sites and, consistent with the overall perceptions of usability and quality, United had the lowest scores.

Key Drivers of UX Quality

Website developers can't fix everything. To help prioritize the issues, we conducted a Key Driver Analysis, which allows us to statistically determine which aspects of the interactions have the biggest impact (for better or worse) on the overall scores. The process involves using the beta weights from a multiple-regression analysis in conjunction with the satisfaction ratings (which get converted into z-scores and percentages). The full details are available in the report.
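For readers who want to try something similar on their own data, here is a minimal sketch of a key driver analysis using standardized regression coefficients. The column names are hypothetical, and the weighting scheme shown is one common approach, not necessarily the exact procedure used in the report.

```python
# Sketch of a key driver analysis: regress an overall UX score on satisfaction
# ratings for individual interactions, using standardized (beta) weights.
import pandas as pd
import statsmodels.api as sm

def key_drivers(df: pd.DataFrame, outcome: str, drivers: list) -> pd.Series:
    cols = [outcome] + drivers
    # Convert everything to z-scores so the regression coefficients are
    # standardized beta weights, comparable across drivers.
    z = (df[cols] - df[cols].mean()) / df[cols].std(ddof=0)
    X = sm.add_constant(z[drivers])
    model = sm.OLS(z[outcome], X).fit()
    betas = model.params.drop("const")
    # Express each beta as a share of the total absolute weight -- one common
    # way to report relative importance (the report may use a different scheme).
    return betas / betas.abs().sum()

# Hypothetical usage (column names are illustrative):
# weights = key_drivers(responses, "suprq_overall",
#                       ["calendar", "check_in", "seat_selection", "site_speed"])
# print(weights.sort_values(ascending=False))
```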

The graph below shows which factors have the biggest impact on UX quality ratings based on the site type (aggregator vs. airline). Users are more satisfied making reservations on aggregators compared to airlines. Interacting with the calendar and the website speed were also better on aggregators.


Figure 2: Key drivers of UX quality by site type (aggregator vs. airline). For example, users are more satisfied making reservations on aggregators compared to airlines.

The Key Driver Analysis showed that the check-in process had the biggest positive contribution for Southwest. Conversely, using miles and interacting with the calendar on American's website were the biggest contributors to perceptions of a poorer experience. Users particularly like selecting seats on Delta's website: it had both the highest rating across the websites and was the biggest driver of Delta's UX quality score. Website speed was a key driver for around half the sites, with American perceived as the slowest.

Mistakes

No one likes to make mistakes when traveling; it makes a trying experience even more difficult. We examined the mistakes users recall making across the websites (selecting the wrong date, wrong time, or wrong cities). The most common mistake was picking the wrong date, which happened around 10% of the time. Users of American's website picked the wrong city at almost twice the rate of the other sites (17%), a statistically significant difference.

This corroborates the finding that the calendar was a key driver of poor quality. It's unclear from our task-based study why users have problems with the calendar or make more mistakes. It's something a follow-up study could look into, as mistakes that go undetected can mean costly ticket changes and missed flights!

What to Fix

When reflecting on what areas of the website to fix, participants mention aesthetics the most. The visual appeal of a website plays an important role in the brand and in establishing and maintaining trust. Users also generally find usable websites more attractive, so improving appearance is both something that's easy for users to mention and likely a surrogate for improving usability. We track appearance as part of the SUPR-Q and break it out in the report.

By far United had the most comments related to appearance, with 41% of participants mentioning something negative, often about the homepage.

United: “Cleaner look. There are so many different colors used on the main page that it’s distracting to look at & takes longer to figure out which part of the site is relevant.”

United: “The homepage looks extremely cluttered. It is daunting.”

Not surprisingly this was reflected in the standardized appearance score for United, which placed it at the 35th percentile relative to our database of 200 websites. The industry is at the 75th percentile, meaning as a group, these websites are more attractive than average (where average is the 50th percentile).

United recently completed a massive merger with Continental. Integrating these airlines involves more than planes, routes, and logos; it also involves a booking system and website that, based on these scores, still need improvement.

Ads

On a similar note, many participants dislike the ads, especially on the aggregator sites.

Priceline: “The ads at the bottom of the page are sloppy.”

Travelocity: “Fewer advertisements on the first page.”

Orbitz: “Not so many ads to click on. I got lost for a bit because I accidentally clicked on an ad.”

The next most common area for improvement was flight narrowing and filtering capabilities.

Delta: “Under the ‘book a trip’ section, the flights could be better sorted with additional options (such as total flight time, rather than just connection time)”

Southwest: “More easily changing the dates once you get to the selection screen after choosing destinations.”

American: “Sometime actually too much info, I wish it would just give me top 5 flights within my time/price parameters with the option to expand if it doesn’t meet my requirements.”

The speed of the websites (which often involves loading hundreds of flight segments) was a common complaint as well.

Orbitz: “a bit speedier response time when finding flights”

Expedia: “The speed it takes to load”

Brand Affinity

Website attitudes are often influenced by participants’ connection to the brand. A few poor experiences can create strong negative affect, which influences people’s likelihood to reuse and recommend the airline (or aggregator). This in turn affects participants’ attitudes toward the websites. To measure brand affinity we used a three-factor brand affinity scale [PDF].

We found that participants have slightly stronger attachments to aggregators than to airline websites, with scores about 5% higher for aggregators. United and Kayak had statistically lower brand connection scores than the group as a whole. Despite Southwest’s generally better reputation for customer service, its brand connection wasn’t distinguishable from American’s and Delta’s (although its loyalty scores did differ, as did the overall scores).

Net Promoter Scores

The Net Promoter Score is meant to be a proxy for the future growth of products and services based on customer word-of-mouth. We extend that idea to websites and find that likelihood to recommend correlates with other UX metrics. The figure below shows how usability scores fall precipitously when moving from promoters to detractors.


Figure 3: Relationship between NPS and usability. Usability scores explain 35% of the variability in likelihood to recommend scores.
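For reference, here is a minimal sketch (with hypothetical data containers, not the study’s code) of how an NPS and the usability/likelihood-to-recommend relationship can be computed. “Explains 35% of the variability” corresponds to an R² of .35, i.e., a correlation of about .59.

```python
# Sketch only: NPS from 0-10 likelihood-to-recommend ratings, and the share of
# variability in those ratings explained by usability scores (R squared).
import numpy as np

def nps(ltr_ratings: np.ndarray) -> float:
    """Net Promoter Score: % promoters (ratings of 9-10) minus % detractors (0-6)."""
    promoters = np.mean(ltr_ratings >= 9)
    detractors = np.mean(ltr_ratings <= 6)
    return 100 * (promoters - detractors)

def r_squared(usability: np.ndarray, ltr: np.ndarray) -> float:
    """Squared correlation between usability and likelihood-to-recommend scores."""
    r = np.corrcoef(usability, ltr)[0, 1]
    return r ** 2

# Hypothetical usage with per-respondent arrays:
# print(nps(ltr_scores))                  # e.g., 38 for this group
# print(r_squared(usability_scores, ltr_scores))  # e.g., ~0.35
```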

The average Net Promoter Score for this group is 38%, similar to retail websites. Orbitz leads the pack with an NPS of 59%, followed by Southwest at 50%. United had the lowest NPS at 6%, which is still above the average NPS of -10% across all websites.

The Value of a Promoter

Promoters are users who gave a 9 or 10 on the likelihood-to-recommend (NPS) question. We can estimate how valuable a promoter is to each website by looking at the percentage of customers who were referred to the website and the percentage who actually referred someone in the last year.

As a group, the aggregators rely more heavily on word-of-mouth than the airline websites do (you can’t get on an Orbitz airplane…yet). On average, aggregators net 58% more customers through referrals than the airlines. Consequently, aggregator websites need fewer promoters to generate new customers. For example, Kayak has the highest referral rate (with respondents reporting both being referred and referring friends), so for every 6 new promoters, Kayak can expect 1 new customer.
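To make that arithmetic concrete, here is a rough sketch with made-up rates; the report’s actual inputs and exact formula are not given in this article.

```python
# Hypothetical promoter-value arithmetic (inputs are illustrative, not study data).
def promoters_per_new_customer(pct_promoters_who_referred: float,
                               pct_referrals_that_become_customers: float) -> float:
    """Expected number of promoters needed to yield one new customer."""
    new_customers_per_promoter = (pct_promoters_who_referred *
                                  pct_referrals_that_become_customers)
    return 1 / new_customers_per_promoter

# Example: if 55% of promoters referred someone in the last year and 30% of
# those referrals became customers, one new customer arrives for roughly every
# 6 promoters -- the same order of magnitude reported for Kayak.
print(round(promoters_per_new_customer(0.55, 0.30)))  # ~6
```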

More information is available for purchase in the full report.

Have a safe flight!
