Heuristic Evaluation

Methods evolve and adapt. The same is true of UX methods that have evolved from other methods, often from disparate fields and dating back decades. The usability profession itself can trace its roots to the industrial revolution. The think aloud protocol, one of the signature methods of usability testing, can trace its roots to psychoanalysis, with influence from Freud, Wundt, and Skinner dating back over

Read More

Expert reviews aren't a substitute for usability testing and don't provide metrics for benchmarking. But they are an effective and relatively inexpensive way to uncover the more obvious pain points in the user experience. Expert reviews are best used when you can't conduct a usability test or in conjunction with insights collected from observing even just a handful of users attempting realistic tasks on a

Read More

It's been 25 years since the development of the Heuristic Evaluation, one of the most influential usability evaluation methods. It's cited as one of the most-used methods by practitioners. Yet its co-creator, Rolf Molich, recently said it was "99% bad." To understand what would drive such a comment, you need to understand the history and purpose of the Heuristic Evaluation and the broader technique called

Read More

It's a question that's been around since Nielsen and Molich introduced the discount usability method in 1990. The idea behind discount usability methods, like heuristic evaluations in particular and expert reviews in general, is that it's better to uncover some usability issues, even if you don't have the time or budget to test actual users. That's because despite the rise of cheaper and faster unmoderated

Read More

Heuristic evaluations are one of the "discount" usability methods introduced over 20 years ago by Jakob Nielsen and Rolf Molich. In theory, a heuristic evaluation involves having a trained usability expert inspect an interface for compliance with a set of guiding principles (heuristics), such as Nielsen's 10 heuristics. In practice, most expert reviews of an interface, including cognitive walkthroughs, are often referred to as heuristic

Read More

There isn't a usability thermometer to tell us how usable an interface is. We observe the effects and indicators of bad interactions and then improve the design. There isn't a single silver-bullet technique or tool that will uncover all problems. Instead, practitioners are encouraged to use multiple techniques and triangulate to arrive at a more complete set of problems and solutions. Triangles of course have

Read More

Heuristic Evaluations and Cognitive Walkthroughs belong to a family of techniques called Inspection Methods. Inspection methods, like Keystroke Level Modeling, are analytic techniques. They don't involve users and tend to generate results in a fraction of the time and at a fraction of the cost of empirical techniques like usability testing. They are also referred to as Expert Reviews because they are usually performed by an expert in usability or

Read More
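As a concrete illustration of how analytic an inspection method can be, here is a minimal sketch of a Keystroke Level Model estimate. The operator times are the commonly cited Card, Moran, and Newell values, and the task sequence is a hypothetical example, not one from the article.

```python
# Keystroke Level Model (KLM) sketch: estimate task time by summing
# standard operator times for a sequence of low-level actions.
OPERATOR_TIMES = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence: str) -> float:
    """Sum operator times (in seconds) for a sequence like 'MHPBB'."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical task: think, home to the mouse, point at a menu, click.
print(round(klm_estimate("MHPBB"), 2))  # prints 3.05
```

Because no users are involved, an estimate like this takes minutes to produce, which is exactly the economy the "discount" label refers to.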

A heuristic evaluation is a process where someone trained in usability principles reviews an application (a website or software). She compares the application against a set of guidelines or principles ("heuristics") that tend to make for more usable applications. For example, if while completing a task a user gets a message that says "Error 1000xz Contact System Administrator" this would violate a heuristic: "Error messages

Read More
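The process described above produces a list of violations, and a common next step is to rate each one for severity so the team knows what to fix first. The sketch below assumes a simple 0 to 4 severity scale (after Nielsen's severity ratings); the finding itself is the hypothetical error-message example from the text.

```python
# Minimal sketch of recording heuristic-evaluation findings and sorting
# them by severity for a report. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str    # the guideline that was violated
    location: str     # where in the interface it was observed
    description: str  # what the evaluator saw
    severity: int     # 0 = not a problem ... 4 = usability catastrophe

findings = [
    Finding(
        heuristic="Error messages should be expressed in plain language",
        location="Checkout task",
        description='Message reads "Error 1000xz Contact System Administrator"',
        severity=3,
    ),
    Finding(
        heuristic="Visibility of system status",
        location="File upload",
        description="No progress indicator during a long upload",
        severity=2,
    ),
]

# Sort the most severe problems to the top of the report.
report = sorted(findings, key=lambda f: f.severity, reverse=True)
for f in report:
    print(f.severity, f.heuristic)
```

Keeping each finding tied to a specific heuristic is what distinguishes a heuristic evaluation from a free-form expert review.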