
3 Common UX Research Pitfalls That Designers Experience

4 minute read

1) You validated the design, but did you validate the problem?

Find the right problem before you find the right solution. You may be designing for an issue that doesn't exist or isn't a high priority. Consider the severity of the problem, the number of users affected, and how often it occurs.

💡
Pro Tip: Ask for evidence supporting the problem statement before designing anything. If none exists, validate the problem first.
2) Using 5 users for everything

Five users are enough only for usability testing, a task-based behavioral method. The right way to interpret the magic 5 is: “With 5 users, you have an 85% chance of finding the usability issues that affect 1/3 of all users.”


This is cleverly explained in a book I strongly recommend, Think Like a UX Researcher.
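That figure comes from a simple probability model (the standard problem-discovery formula, not something specific to this article): if each tester independently has a 1-in-3 chance of running into a given issue, the odds that at least one of five testers finds it are 1 − (2/3)⁵ ≈ 87%, roughly the 85% quoted above. A quick sketch:

```python
def chance_of_finding(p, n):
    """Probability that at least one of n testers hits an issue
    affecting a fraction p of all users (independent testers assumed)."""
    return 1 - (1 - p) ** n

# Issue affecting 1/3 of users, 5 testers:
print(f"{chance_of_finding(1/3, 5):.0%}")   # close to the 'magic 5' figure

# Diminishing returns: doubling testers doesn't double discovery
print(f"{chance_of_finding(1/3, 10):.0%}")
```

Note the diminishing returns: this is why running many small rounds of testing usually beats one big one.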

For interviews, you’ll likely need between 8 and 15 users (per persona) before reliable patterns emerge. Sooner or later your interviews become repetitive and you hardly discover new information: you have reached the qualitative saturation point (as explained by the Nielsen Norman Group).

Be careful with surveys; use a Sample Size Calculator.

Surveys are trickier than they seem. What if you don’t get the number of participants you need? Say hello to the margin of error.

Understanding margin of error: the responses you receive in your survey will vary within a specific range. For example, if 54% of users answered “yes” and your margin of error is 15%, you should interpret it as “between 39% and 69% of users said yes.” Massive difference, huh? The margin of error shrinks as the number of participants grows.
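For intuition, the margin of error for a simple yes/no proportion at roughly 95% confidence follows the standard formula z · √(p(1 − p)/n). A rough sketch of the arithmetic (no substitute for a proper sample-size calculator):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for a proportion p from n respondents
    at ~95% confidence (z = 1.96), assuming simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# 54% said "yes" in a survey of about 42 respondents:
print(f"±{margin_of_error(0.54, 42):.0%}")   # ~±15%, the example above

# The same result with 400 respondents is far tighter:
print(f"±{margin_of_error(0.54, 400):.0%}")
```

Notice the square root in the denominator: to halve the margin of error you need roughly four times as many respondents.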

💡
Pro Tip: The number of participants you need varies by research method.
3) Drawing conclusions from the wrong methods

Example 1 – Interview: Imagine you interviewed 10 users, and 6/10 said they drink coffee because it is relaxing. It is incorrect to assume that 60% of your users do it because it is relaxing. Interviews are used to reveal “why” but not “how many.” You would need a quantitative method, like a survey, to confirm how many.

Example 2 – Analytics: Imagine you have a landing page, and analytics show a 70% abandonment rate. It would be incorrect to assume your users aren’t interested in your product. In this imaginary case, analytics signal “what is happening” but can’t tell you “why.”

Perhaps they’re interested but confused by something. Perhaps there’s a bug, or they’re simply looking around before deciding to purchase. Follow up with qualitative methods, such as interviews, to reveal “why.”

💡
Pro Tip: Qualitative methods answer the “why & how.” Quantitative methods answer the “what & how many.”
By Daniel Chinchilla, Evaluative User Researcher at CDK Global
Article uploaded on 07/19/23