Comparison of UX Metrics in Moderated vs. Unmoderated Studies

Unmoderated Research

Unmoderated testing platforms allow for quick data collection from large sample sizes. This has enabled researchers to answer questions that were previously difficult or cost prohibitive to answer with traditional lab-based testing. But is the data collected in unmoderated studies, both behavioral and attitudinal, comparable to what you get from a more traditional lab setup?

Comparing Metrics

There are several ways to compare the agreement or
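One straightforward way to compare metrics across the two modes is to test whether a measure such as task-completion rate differs significantly between a moderated and an unmoderated sample. A minimal sketch using a two-proportion z-test, with hypothetical completion counts made up purely for illustration:

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z-test: do completion rates from two study modes differ?"""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical data: 18 of 20 participants completed a task in a moderated
# lab study; 160 of 200 completed the same task unmoderated.
z, p = two_prop_z(18, 20, 160, 200)
```

With these made-up numbers the difference (90% vs. 80%) is not statistically significant, which illustrates the point of the excerpt: with small lab samples, even fairly large-looking gaps between modes can be within sampling error.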


Small design changes can have large consequences for website purchases. But detecting these small differences (e.g., 2%–10% changes) through behaviors and attitudes has generally not been feasible with traditional lab-based testing, due to the time and cost of recruiting participants and facilitating sessions. With unmoderated testing, organizations can now quickly collect data from hundreds to thousands of participants around the world to
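To see why lab-sized samples can't detect a 2%–10% difference, a standard two-proportion power calculation (normal approximation) gives the per-group sample size required. The 5% and 8% conversion rates below are hypothetical values chosen only to illustrate the scale involved:

```python
import math

def sample_size_two_props(p1, p2):
    """Per-group n to detect p1 vs. p2 with a two-sided two-proportion z-test
    (normal approximation, alpha = .05, power = .80)."""
    z_alpha = 1.96   # critical z for two-sided alpha = .05
    z_beta = 0.84    # z for 80% power
    p_bar = (p1 + p2) / 2
    num = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Detecting a 3-point lift in conversion (5% -> 8%) takes on the order of
# a thousand participants per group -- far beyond a typical lab study.
n = sample_size_two_props(0.05, 0.08)
```

This is why the shift to unmoderated platforms matters: the sample sizes needed to resolve small behavioral differences are simply out of reach for moderated sessions.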


To understand problems on a website, nothing quite beats watching users. The process provides a wealth of information about what users can or can’t do and what might be causing problems in an interface. The major drawback to watching users live or reviewing session recordings is that it takes a lot of focused time. 5 to 20 participants—the typical sample size in moderated studies—isn’t
