It’s often called web surfing or web browsing, but it probably should be called web doing.
While there is still plenty of time to kill using the web, in large part, we’re all trying to get things done.
Purchasing, reserving, comparing, and communicating: Internet behavior is largely a goal-directed activity. If a website doesn’t help users accomplish their goals, it’s unlikely they will return or refer their friends.
In user research we break large goals down into bite-sized chunks we call tasks. Complete the tasks (registering, entering payment details, etc.) and you should eventually accomplish your goals.
Task completion is the fundamental usability metric. If users can’t accomplish what they’re trying to do, not much else matters.
Not surprisingly then, task failure has a negative effect on our impressions of website usability.
Were you able to accomplish what you were doing?
To understand just how impactful task failure is, I asked users of 59 websites to reflect on their most recent experience. The data was collected as part of the ongoing process of building and maintaining the database of websites for the SUPR-Q. In total, data from almost 4000 users was analyzed, and the websites ranged from ecommerce giants to informational, education, and government websites.
I asked users what they were doing on their last visit to the website (e.g. looking up car values, buying office furniture). I then asked whether they were able to successfully complete their intended tasks.
Here’s what I found:
- The average reported task completion rate was 88% with a high of 100% and a low of 54%.
- The 10 websites with the highest task completion rates were largely informational (CNN, New York Times, Flickr), but Amazon also made the cut. The reported completion rates for this group ranged from 95% to 100%.
- The 10 websites with the lowest reported task completion rates were a mix of dating websites (Match.com, eHarmony, and PlentyofFish), two third-party automotive websites (Autotrader, Cars.com), and even the mighty Craigslist. The reported completion rates for this group ranged from 80% down to 54%.
Here are some observations:
Dating is harder than reading: It’s interesting that three of the websites with the lowest task completion rates were dating websites. At first this seemed surprising, but examining the tasks users reported attempting quickly made the reason obvious. The major tasks on dating websites were, not surprisingly, looking for dates (e.g. “I was trying to find a date,” “looking for matches for dating”), so the “success” rate speaks to a much larger (and more difficult) goal than, say, buying a book.
People have imprecise and forgiving memories: Last year Jakob Nielsen reported on a study with an average task completion rate of 72%. Why was his lower? One reason is that the data in this article are self-reported accounts of past behavior, whereas Nielsen’s came from usability tests. People are notoriously inaccurate recorders of the past, so it’s not too surprising that the reported completion rate is higher.
What’s more, Nielsen limited his analysis to ecommerce sites. While the sample of websites in this analysis does include a substantial number of ecommerce websites, it also includes a number of informational, government, and education websites where the tasks are less structured (e.g. browsing information), which likely inflates the task completion rate.
Usability tests are different from retrospective accounts: We do the best we can to simulate real-world actions in the usability lab, but many parts will always remain a bit artificial. When you ask users to attempt tasks, even realistic ones, they may have little motivation or interest in solving them, and task completion rates will be lower. This is one reason why the average task completion rate I found across thousands of usability tasks was 78%, lower than this study’s retrospective rate but similar to Nielsen’s.
So what are the effects of task failure? Put succinctly, you’ll lose users.
If they fail, they won’t return
We can’t know for certain whether a user will visit a site again, but we can get some idea of the chances just by asking. One of the loyalty items on the SUPR-Q is “I will likely visit this website in the future,” with 5 response options from 1 (strongly disagree) to 5 (strongly agree). For this analysis I narrowed the focus to the extreme responders and looked at the ratio of 1’s to 5’s.
Around 60% of users gave the highest score of a 5, meaning they will likely visit the site again. Only 3% of users gave the lowest rating of a 1, meaning they are least likely to visit the site again.
| Response | Failed Task | Total Users (% Failed) |
|----------|-------------|------------------------|
| 1 (Strongly Disagree) | 37 | 95 (39%) |
| 5 (Strongly Agree) | 152 | 2214 (7%) |

Table 1: Number of users failing to accomplish their most recent task on a website and their likelihood of returning.
When we looked at task-failure, 39% of users who gave a 1 failed the task, whereas only 7% of users who gave a 5 failed the task.
Users are 5 times less likely to return to a website if they fail a task.
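The Table 1 percentages and the “5 times” figure reduce to simple arithmetic on the counts. A quick sketch (counts taken from the table above; variable names are mine):

```python
# Counts from Table 1: users who failed their task, out of all users
# giving each extreme loyalty rating.
failed_given_1, total_1 = 37, 95      # rated 1 (strongly disagree)
failed_given_5, total_5 = 152, 2214   # rated 5 (strongly agree)

rate_1 = failed_given_1 / total_1     # failure rate among the 1's
rate_5 = failed_given_5 / total_5     # failure rate among the 5's

print(f"Failure rate among 1's: {rate_1:.0%}")   # 39%
print(f"Failure rate among 5's: {rate_5:.0%}")   # 7%
print(f"Ratio: {rate_1 / rate_5:.1f}x")          # 5.7x
```

The ratio of the two failure rates (roughly 39% vs. 7%) is where the “5 times less likely to return” claim comes from.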
If they fail, they will tell!
In addition to users not returning, the other harbinger of poor growth is users telling others about their bad experience. Just as with future visits, we can’t know whether users will actually say bad things about the website, but we can ask how likely they are to recommend it.
I asked the question “How likely is it that you’d recommend this website to a friend?” on an 11-point scale (0 = Not at all Likely; 10 = Extremely Likely) and computed the Detractors (responses from 0 to 6) and Promoters (responses from 9 to 10), as is done for the Net Promoter Score. The results were revealing.
Promoters made up 44% of users and Detractors made up 26% of users (Passives made up 30%). When we add task-failure to the equation we see that users who failed their most recent task were 3 times more likely to be Detractors (22%) than Promoters (7%) of the website.
Table 2: Number of users failing to accomplish their most recent task on a website and their likelihood of recommending to a friend.
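The bucketing rule described above (Detractors 0–6, Promoters 9–10, Passives 7–8) can be sketched in a few lines. The function name and the sample responses below are illustrative, not data from the study:

```python
def nps_buckets(responses):
    """Classify 0-10 likelihood-to-recommend responses into
    Promoters (9-10), Passives (7-8), and Detractors (0-6)."""
    detractors = sum(1 for r in responses if r <= 6)
    promoters = sum(1 for r in responses if r >= 9)
    passives = len(responses) - promoters - detractors
    return promoters, passives, detractors

# Made-up sample of ten responses
responses = [10, 9, 7, 8, 3, 10, 6, 9, 2, 8]
p, pa, d = nps_buckets(responses)
nps = (p - d) / len(responses) * 100  # Net Promoter Score
print(p, pa, d, nps)                  # 4 3 3 10.0
```

The Net Promoter Score itself is simply the percentage of Promoters minus the percentage of Detractors.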
This relationship drives home how the user experience has a strong effect on customer loyalty. Poor experiences lead to negative word of mouth.
Task completion is, of course, important and should be measured. Retrospective accounts of task completion are likely inflated. However, despite this drawback, making comparisons across websites and on the same website over time is still a valuable way to track progress in design improvements.
This analysis helps quantify the impact of task failure. Not finding a spouse aside, when users fail to accomplish their goals, they are 5 times less likely to return and 3 times more likely to tell their friends not to visit the website.
Oh, and in case you were curious, the website PlentyofFish.com had a reported task completion rate of 76%, which was statistically higher (p < .10) than Match.com’s (54%) and eHarmony’s (57%).