So, how do we measure UX? Well, let's find out! A friendly welcome to the video course, outlining everything we'll be going through: design impact, business metrics, design metrics, surveys, target times and states, measuring UX in B2B and enterprise, design KPI trees, the Kano model, event storming, choosing metrics, reporting design success, and how to measure design systems and UX research efforts.

Keywords: Design impact, UX metrics, business goals, articulating design value, real-world examples, showcasing impact, evidence-driven design.
2. Design Impact
In this segment, we'll explore how and where we, as UX designers, make an impact within organizations: where we fit in the company structure, how to build strong relationships with colleagues, and how to communicate design value in business terms.

Keywords: Design impact, design ROI, org charts, stakeholder engagement, business language vs. UX language, Double Diamond vs. Reverse Double Diamond, risk mitigation.
3. OKRs and Business Metrics
We'll explore the key business terms and concepts related to measuring business performance. We'll dive into business strategy and tactics, and unpack the components of OKRs (Objectives and Key Results), KPIs, SMART goals, and metrics.

Keywords: OKRs, objectives, key results, initiatives, SMART goals, measurable goals, time-bound metrics, goal-setting framework, business objectives.
4. Leading and Lagging Indicators
Businesses often speak of leading and lagging indicators: predictive and retrospective measures of success. Let's explore what they are, how they differ, and how we can use them to understand the immediate and long-term impact of our UX work.

Keywords: Leading vs. lagging indicators, cause-and-effect relationship, backwards-looking and forward-looking indicators, signals for future success.
5. Business Metrics, NPS
We dive into the world of business metrics, from Monthly Active Users (MAU) to Monthly Recurring Revenue (MRR) to Customer Lifetime Value (CLV), and many other metrics that often find their way onto the dashboards of senior management.

Also, almost every business measures NPS. Yet NPS has many limitations: it requires a large sample size to be statistically reliable, and what people say and what people do are often very different things. Let's see what we as designers can do with NPS and how it relates to our UX work (a short calculation sketch follows below).

Keywords: Business metrics, MAU, MRR, ARR, CLV, ACV, Net Promoter Score, customer loyalty.
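The arithmetic behind NPS is worth seeing once. Here is a minimal Python sketch (the response data is made up for illustration): respondents answering 9 or 10 count as promoters, 0 through 6 as detractors, and the score is the percentage-point difference between the two, so it ranges from -100 to +100.

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical responses to an in-app "How likely are you to recommend us?" prompt
responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8, 9, 10, 7, 5, 9]
print(round(nps(responses), 1))  # 8 promoters, 3 detractors out of 15 -> 33.3
```

Even with this simple math, the sample-size caveat from the chapter still applies: with only a handful of responses, a single promoter or detractor swings the score dramatically.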
6. Business Metrics, CSAT, CES
We'll explore the broader context of business metrics, including revenue-related measures like Monthly Recurring Revenue (MRR) and Annual Recurring Revenue (ARR), Customer Lifetime Value (CLV), and churn rate.

We'll also dive into Customer Satisfaction Score (CSAT) and Customer Effort Score (CES). We'll discuss how these metrics are calculated (see the sketch below), their importance in measuring customer experience, and how they complement other well-known (but not necessarily helpful) business metrics like NPS.

Keywords: Customer Lifetime Value (CLV), churn rate, Customer Satisfaction Score (CSAT), Customer Effort Score (CES), Net Promoter Score (NPS), Monthly Recurring Revenue (MRR), Annual Recurring Revenue (ARR).
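As a rough illustration of how these two metrics are typically computed (the rating scales below are common conventions, not the only option): CSAT is usually reported as the share of respondents choosing the top two boxes on a 1-5 satisfaction scale, while CES is often the mean response to a 1-7 "how easy was it?" question.

```python
def csat(ratings: list[int]) -> float:
    """CSAT as the percentage of satisfied responses (4 or 5 on a 1-5 scale)."""
    return 100 * sum(1 for r in ratings if r >= 4) / len(ratings)

def ces(ratings: list[int]) -> float:
    """CES as the mean response on a 1-7 ease-of-interaction scale."""
    return sum(ratings) / len(ratings)

# Hypothetical post-interaction survey responses
print(round(csat([5, 4, 4, 3, 5, 2, 5, 4]), 1))  # 75.0
print(round(ces([6, 7, 5, 6, 4, 7, 6]), 2))      # 5.86
```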
7. Feedback Scoring and Gap Analysis
If you are looking for a simple alternative to NPS, feedback scoring and gap analysis might be a neat little helper. The approach transforms qualitative user feedback into quantifiable data, allowing us to track UX improvements over time (a rough sketch follows below). Unlike NPS, which focuses on future behavior, feedback scoring looks at past actions and current perceptions.

Keywords: Feedback scoring, gap analysis, qualitative feedback, quantitative data.
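There is no single canonical formula here, but one minimal way to sketch the idea in Python: tag each piece of open-ended feedback with a theme and a score from -2 (very negative) to +2 (very positive), then track the average per theme over time. The themes, scale, and data below are illustrative assumptions, not the exact method taught in this chapter.

```python
from collections import defaultdict

# (theme, score) pairs assigned manually while reading feedback;
# -2 = very negative ... +2 = very positive. All values are made up.
feedback = [
    ("checkout", -2), ("checkout", -1), ("checkout", 1),
    ("search", 2), ("search", 1),
    ("onboarding", -1), ("onboarding", 0),
]

by_theme: dict[str, list[int]] = defaultdict(list)
for theme, score in feedback:
    by_theme[theme].append(score)

for theme, scores in sorted(by_theme.items()):
    print(f"{theme:<12} avg {sum(scores) / len(scores):+.2f}  (n={len(scores)})")
```

Gap analysis then compares where each theme currently sits against where you'd like it to be, and the distance between the two is what you prioritize and re-measure after the next release.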
8. Design Metrics (TPI, SUS, SUPR-Q)
We'll explore the landscape of established and reliable design metrics for tracking and capturing UX in digital products, from task success rate and time on task to the System Usability Scale (SUS), the Standardized User Experience Percentile Rank Questionnaire (SUPR-Q), and the Accessible Usability Scale (AUS), with an overview of when and how to use each, the drawbacks, and things to keep in mind (a SUS scoring sketch follows below).

Keywords: UX metrics, KPIs, task success rate, time on task, error rates, error recovery, SUS, SUPR-Q.
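SUS scoring is mechanical enough to show in a few lines. A minimal sketch, assuming the standard 10-item questionnaire with 1-5 responses: odd-numbered (positively worded) items contribute their response minus one, even-numbered (negatively worded) items contribute five minus their response, and the sum is multiplied by 2.5 to land on a 0-100 scale.

```python
def sus_score(responses: list[int]) -> float:
    """Score one participant's 10 SUS responses (each 1-5) on the 0-100 scale."""
    if len(responses) != 10:
        raise ValueError("SUS expects exactly 10 item responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# One hypothetical participant's answers to the 10 SUS items
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```

The product's SUS score is then the mean across participants; what counts as "good" depends on the benchmark you compare against.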
9. Design Metrics (UMUX-Lite, SEQ, UEQ)
We'll continue with slightly shorter alternatives to SUS and SUPR-Q that could be used in a quick email survey or an in-app prompt: UMUX-Lite and the Single Ease Question (SEQ). We'll also explore the "big behemoths" of UX measurement, such as the User Experience Questionnaire (UEQ), Google's HEART framework, and custom UX measurement surveys, and how to bring key metrics together in one simple UX scorecard tailored to your product's unique needs (a small scoring sketch follows below).

Keywords: UX metrics, UMUX-Lite, Single Ease Question (SEQ), User Experience Questionnaire (UEQ), HEART framework, UX scorecards.
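To show how lightweight these two instruments are, here is a minimal sketch. It assumes the common 7-point versions: UMUX-Lite's two items ("capabilities meet my requirements" and "easy to use") rescaled to 0-100, and SEQ averaged across respondents. Some practitioners additionally apply a published regression adjustment to make UMUX-Lite scores comparable with SUS; that step is left out here.

```python
def umux_lite(capabilities: int, ease_of_use: int) -> float:
    """UMUX-Lite raw score from two items rated 1-7, rescaled to 0-100."""
    return ((capabilities - 1) + (ease_of_use - 1)) / 12 * 100

def seq(ratings: list[int]) -> float:
    """Single Ease Question: mean of 1-7 'how easy was this task?' ratings."""
    return sum(ratings) / len(ratings)

# Hypothetical responses from a short in-app prompt
print(round(umux_lite(6, 7), 1))       # 91.7
print(round(seq([6, 5, 7, 6, 4]), 2))  # 5.6
```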
10. Top Tasks Analysis
The most impactful way to measure UX is to study how successful users are in completing their tasks within their common customer journeys. With top tasks analysis, we focus on what matters and explore task success rates and time on task (a small calculation sketch follows below). We need to identify representative tasks and bring 15–18 users in for testing. Let's dive into how it all works and some of the important gotchas and takeaways to consider.

Keywords: Top tasks analysis, UX metrics, task success rate, time on task, qualitative testing, 80% success, statistical reliability, baseline testing.
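A minimal sketch of the two core numbers, using made-up session data: task success rate as the share of participants who completed the task, and time on task summarized with the median of successful attempts (medians are less sensitive to the long tail of struggling sessions than means).

```python
from statistics import median

# (completed?, time on task in seconds) for one top task; sample data is made up.
attempts = [
    (True, 48), (True, 62), (False, 120), (True, 55), (True, 71),
    (False, 140), (True, 43), (True, 66), (True, 59), (True, 90),
]

successful_times = [seconds for ok, seconds in attempts if ok]
success_rate = 100 * len(successful_times) / len(attempts)

print(f"Task success rate: {success_rate:.0f}%")                 # 80%
print(f"Median time on task: {median(successful_times):.1f} s")  # 60.5 s
```

With 15–18 participants per task, a single failure still moves the rate noticeably, which is why repeating the same tasks as a baseline over time matters.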
11. Surveys and Sample Sizes
Designing good surveys is hard! We need to be careful about how we shape our questions to avoid bias, how to find the right audience segment and a large enough sample size, and how to achieve high confidence levels and low margins of error (a sample-size sketch follows below). In this chapter, we review best practices and a cheat sheet for better survey design, along with do's and don'ts on question types, rating scales, and survey pre-testing.

Keywords: Survey design, question types, rating scales, survey length, pre-testing, response rates, statistical significance, sample quality, mean vs. median scores.
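For the sample-size part, the standard formula for estimating a proportion can be sketched directly. A minimal example, assuming a 95% confidence level (z ≈ 1.96) and the most conservative proportion of 0.5; the optional finite-population correction shrinks the requirement when your entire audience is small.

```python
from math import ceil

def required_sample_size(margin_of_error: float, z: float = 1.96,
                         p: float = 0.5, population: int | None = None) -> int:
    """Responses needed to estimate a proportion within +/- margin_of_error.

    Uses n = z^2 * p * (1 - p) / e^2, with an optional finite-population correction.
    """
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)
    return ceil(n)

print(required_sample_size(0.05))                   # ~385 for a very large population
print(required_sample_size(0.05, population=2000))  # ~323 when only 2,000 users exist
```

Note that this only covers sampling error; biased questions or a skewed sample can make even a large survey misleading.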
12. Measuring UX in B2B and Enterprise