Methods

It was another busy year at MeasuringU. We posted 50 new articles, added new features to MUIQ (our UX testing platform), hosted our 6th UX Bootcamp, and released the book Benchmarking the User Experience. We also moved into a bigger new space in Denver’s Cherry Creek neighborhood. It’s three times the size of our old space, with state-of-the-art labs, and we hosted UX Book Club this

Read More

The wide range of UX methods is one of the things that makes UX such an interesting field. Some methods have been around for decades (like usability testing), others are more recent additions, and some seem to be just slight variations on existing methods. We’ve been tracking the methods UX professionals report using for a few years by analyzing the results of the

Read More

UX research efforts should be driven by business questions and a good hypothesis. Whether the research is a usability evaluation (unmoderated or moderated), a survey, or an observational method like a contextual inquiry, decisions need to be made about question wording, response options, and tasks. But in the process of working through study details, the original intent of the study can often get lost. At its

Read More

It was another busy year on MeasuringU.com with 50 new articles, a new website, a new unmoderated research platform (MUIQ), and our 5th UX Bootcamp. In 2017, over 1.2 million people viewed our articles. Thank you! The most common topics we covered included usability testing, benchmarking, the 3Ms (methods, metrics, and measurement), and working with online panels. Here’s a summary of the articles I

Read More

We conduct a lot of quantitative online research, both surveys and unmoderated UX studies. Much of the data we collect in these studies is from closed-ended questions or task-based questions with behavioral data (time, completion, and clicks). But just about any study we conduct also includes some open-ended response questions. Our research team then needs to read and interpret the free-form responses. In some cases,
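
To make the distinction concrete, here is a minimal sketch (hypothetical field names and made-up records, not from any MeasuringU study) of how the behavioral measures can be summarized automatically while the free-form comments are set aside for the research team to read and code by hand:

from statistics import mean

# Made-up study records: each has behavioral data plus an optional open-ended comment.
responses = [
    {"completed": True,  "task_time_sec": 48,  "comment": "Couldn't find the search box at first."},
    {"completed": True,  "task_time_sec": 35,  "comment": ""},
    {"completed": False, "task_time_sec": 120, "comment": "Gave up; the filter labels were confusing."},
]

# Closed-ended and behavioral fields summarize directly into metrics.
completion_rate = mean(1 if r["completed"] else 0 for r in responses)
avg_time = mean(r["task_time_sec"] for r in responses)

# Open-ended responses can't be reduced to a number; collect them for manual coding.
open_ended = [r["comment"] for r in responses if r["comment"].strip()]

print(f"Completion rate: {completion_rate:.0%}; mean task time: {avg_time:.0f}s")
print(f"{len(open_ended)} open-ended responses to read and interpret")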

Read More

The range of methods available to the researcher is one of the things that makes UX research such an interesting and effective field. The recently completed UXPA salary survey provides one of the more comprehensive pictures of the methods practitioners use. It contains data from over 1,200 respondents in 37 countries, collected in 2016. Similar data was collected in 2014 and 2011 with similarly sized

Read More

There’s a continued need to measure and improve the user experience. In principle, it’s easy to see the benefits of having qualified participants use an interface and measuring the experience to produce reliable metrics that can serve as benchmarks. But in practice, a number of obstacles make it difficult: time, cost, finding qualified participants, and even obtaining a stable product to test. These challenges seem

Read More

There’s been a lot of buzz about design thinking: what it means, how to apply its principles, and even whether it’s a new concept. While design thinking is usually applied to more scientific endeavors, scientific thinking can also benefit design. Although it may be a less trendy topic, a scientific approach to design is certainly effective. Scientific principles lead to better decision making about designs, which

Read More

It's better to be approximately right than exactly wrong. A version of those words came from Carveth Read, a 19th-century author, in a book on logic and reasoning. The quote is often misattributed to John Maynard Keynes, the more famous economist and early statistician. Despite the quote's age and the misattribution, it's sound wisdom for any researcher. It doesn't matter how precise
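
One way to read that advice (an illustration added here, not something from the article itself): report an estimate together with its interval rather than as a single precise-looking number. The minimal sketch below computes an adjusted-Wald interval for a small-sample completion rate, one common choice for this kind of data; the counts are made up.

import math

def adjusted_wald_interval(successes, trials, z=1.96):
    """Approximate 95% adjusted-Wald interval for a proportion."""
    # Add z^2/2 successes and z^2 trials, then apply the usual Wald formula.
    adj_n = trials + z ** 2
    adj_p = (successes + z ** 2 / 2) / adj_n
    margin = z * math.sqrt(adj_p * (1 - adj_p) / adj_n)
    return max(0.0, adj_p - margin), min(1.0, adj_p + margin)

# Example: 9 of 12 participants completed the task.
low, high = adjusted_wald_interval(9, 12)
print(f"Observed: {9/12:.0%}; 95% interval: {low:.0%} to {high:.0%}")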

Read More

Your job title doesn't have to be "researcher" or "statistician" for you to use data to drive design decisions. You can apply some best practices even when numbers aren't your best friend. It's actually easier for a designer to enhance their skills with quantitative data than for a statistician to enhance their analytical skills with design principles. Here are six best practices for using numbers to inform

Read More