PURE

Methods evolve and adapt. The same is true of UX methods that have evolved from other methods, often from disparate fields and dating back decades. The usability profession itself can trace its roots to the industrial revolution. The think aloud protocol, one of the signature methods of usability testing, can trace its roots to psychoanalysis, with influence from Freud, Wundt, and Skinner dating back over…

Read More

The PURE (Practical Usability Rating by Experts) method is an analytic technique that identifies potential problems users may encounter with an interface. In a PURE evaluation, evaluators familiar with UX principles and heuristics break down tasks into small steps. They then rate each step from 1 to 3 based on a pre-defined rubric. The higher the score, the more difficult the experience. As we continue to…

Read More
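The excerpt above captures the core of the scoring: each step of a task receives a rating from 1 to 3, and higher totals indicate a harder experience. The sketch below, in Python, illustrates that arithmetic under the common summary of the method in which a task's score is the sum of its step ratings; the task, step names, and ratings shown are hypothetical.

```python
# Minimal sketch of PURE-style step scoring.
# Assumption: a task's score is the sum of its per-step ratings (1-3),
# as the method is commonly summarized. Steps and ratings are illustrative.

checkout_task = {
    "Find the product page": 1,   # 1 = little or no difficulty
    "Add item to cart": 1,
    "Enter shipping details": 3,  # 3 = likely to cause difficulty or failure
    "Confirm payment": 2,
}

def score_task(step_ratings):
    """Sum the per-step ratings; a higher total suggests a harder task."""
    for step, rating in step_ratings.items():
        if rating not in (1, 2, 3):
            raise ValueError(f"Rating for '{step}' must be 1, 2, or 3")
    return sum(step_ratings.values())

print(score_task(checkout_task))  # -> 7
```

In practice the per-step ratings come from evaluators applying the pre-defined rubric, not from code; the sketch only shows how step-level ratings roll up into a task-level number.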

In an earlier article, I described the PURE methodology. PURE stands for Practical Usability Rating by Experts. Evaluators familiar with UX principles and heuristics decompose tasks into small steps and rate each step based on a pre-defined scoring system (called a rubric), as shown in Figure 1. Figure 1: Scoring rubric for PURE. The PURE method is analytic. It’s not based on…

Read More