21. Design’s value is in these 4 activities
• Driving Understanding & Empathy
• Creating Clarity & Behavioral Fit
• Exploring the possible & desirable
• Envisioning experiences
23. Clarity &
Behavioral Fit
Provide the right structures to help
people achieve value.
Present words & images to maximize
efficient communication of meaning &
possibility.
Fit the behavior of the system(s) to the
behaviors of the humans who use them.
24. Exploring the
Possible & Desirable
Validation is not an end point,
but a beginning point.
Association of a multiplicity of
ideas is at the heart of
creativity.
25. Envisioning Ideas
& Experiences
Tell stories with words &
images to convey insights.
Use narratives to express
the human possibilities of a
design direction.
26. Some skills to create that value
• Storytelling
• Visual Thinking
• Information Structure, Navigation,
& Presentation
• Activity sequencing and modeling
• Workshop Facilitation
• Prototyping/Simulations
28. Different types of metrics
Types
•Quantitative data
•Qualitative data, quantified.
Collection Methods
•Self-reported
•Gathered through automated instrumentation
30. Cascading use of data
Metric: What is available to measure that relates to our hypothesis?
Correlation: If we compare the original metric to another metric, can
that help us understand the original hypothesis?
Interpretation: What does the correlation tell us?
Which direction indicates the desired effect?
Threshold: What value of the metric will tell us we have reached an
otherwise qualitative goal?
Trend: How do the prime metric and correlated metric compare over time?
How strong a correlation in the trend would be significant? (differential)
Milestone: What can we map against a timeline to help us understand and
interpret possible moments of cause and effect? (such as releases, ship dates)
Baseline: The value of a metric, or a combination of metrics, at the
beginning of an initiative.
Hypothesis: What do we want to learn from our data?
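The cascade above can be sketched as a minimal check on a single metric series. A hedged Python sketch, where the metric name, weekly data, and threshold are all invented for illustration:

```python
# Minimal sketch of the cascade applied to one metric series;
# the data and the threshold are invented for illustration.
weekly_metric = [10, 11, 13, 12, 14, 16, 17, 18]  # e.g. hours designing per week

baseline = weekly_metric[0]           # value at the start of the initiative
threshold = 15                        # stands in for a qualitative goal
trend = weekly_metric[-1] - baseline  # crude direction over time

reached_goal = weekly_metric[-1] >= threshold
print(f"baseline={baseline}, trend={trend:+d}, goal reached: {reached_goal}")
```

A real setup would also annotate milestones (releases, ship dates) along the same timeline before interpreting any movement as cause and effect.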
31. Setting up your measurement*
*extrapolated from a case study by Intuit
Metric: Time Designing
Data Type: Quantitative
Collected by: Self-reporting, using a tool such as Harvest
Desired Outcome: Increase design quality &/or designer engagement
Hypothesis: Increasing time designing will increase design quality
&/or designer engagement
Measuring Desired Outcomes: NPS, heuristics, usability testing,
customer satisfaction, etc.
33. For Correlations to work …
… The two metrics need to be plotted on the
same ratio.
… The correlation itself needs to be along a
continuous and steep grade.
… Major exceptions to the correlation over
time need to be followed up on thoroughly.
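Checking such a correlation numerically comes down to a Pearson coefficient. A minimal Python sketch, where the metric names and weekly figures are invented, and min-max scaling stands in for putting both metrics "on the same ratio" when plotting:

```python
# Hedged sketch of correlating two metric series; all names and
# numbers below are illustrative, not from the case study.
from math import sqrt

def normalize(series):
    """Min-max scale to 0..1 so two metrics can share one plotted ratio."""
    lo, hi = min(series), max(series)
    return [(v - lo) / (hi - lo) for v in series]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative weekly data: self-reported hours designing vs. a quality score.
hours_designing = [12, 15, 14, 18, 20, 22, 21, 25]
quality_score = [6.1, 6.4, 6.2, 7.0, 7.3, 7.8, 7.5, 8.2]

r = pearson(hours_designing, quality_score)
print(f"correlation: {r:.2f}")
```

Weeks that break an otherwise strong `r` are exactly the "major exceptions" worth following up on individually.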
34. How can you assess quality?
Is the product organization aligned in their understanding of the
value of your design(ing) to the business & their customers?
1. There is no alignment across the product organization.
2. There have been gains in alignment, as seen in open trials of design and
research activities and processes.
3. Alignment is growing, as seen by more non-designers participating in design
activities.
4. Design value is well understood and consistently articulated across the
product organization.
36. Vital Signs
for DesignOps
Top 3-5 metrics that tell you
something might be wrong,
or everything is ok.
Possible Examples
• Number of UX stories
that started in a sprint’s
backlog, but didn’t get
deployed to production.
• Attrition rate within a
design team compared to
the whole organization.
• Time spent designing/
researching.
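A vital-signs check like this can be a few lines of code. A minimal Python sketch in which every metric value and threshold is an assumption for the example, not a recommendation:

```python
# Illustrative sketch of a DesignOps "vital signs" check; the metric
# names, data, and thresholds are assumptions for the example.
undeployed_ux_stories = 1      # started in the sprint backlog, not shipped
team_attrition = 0.18          # design-team attrition rate
org_attrition = 0.10           # whole-organization attrition rate
hours_designing_per_week = 16  # average per designer

# Thresholds are invented; a real team would calibrate its own.
vitals = {
    "undeployed UX stories": undeployed_ux_stories <= 2,
    "attrition vs. org": team_attrition <= org_attrition * 1.25,
    "time designing": hours_designing_per_week >= 15,
}

for name, ok in vitals.items():
    print(f"{name}: {'ok' if ok else 'needs attention'}")
```

The point is the small, fixed set of signals: one failing vital sign is a prompt to investigate, not a verdict.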
39. People
•Is recruitment leading to
best-in-class talent being hired?
•Is the team engaged and
growing professionally?
•Are the team’s values being upheld?
•Are diversity & inclusion
upheld as important values?
40. Workflow
• Are teams meeting the needs of
stakeholders?
• How much of the total design process are
team members being encouraged &/or
allowed to do?
• Are designs regularly being included in
shipped goods?
41. Communications
• Does the team have line of sight into the
team & business?
• Is the signal:noise ratio being managed?
• Is tribal knowledge easily available?
42. Tools
•Is the team able to get the
tools they need to be
successful & productive?
•Are tools easily integrated
to each other, and to the
broader set of
stakeholders (where
appropriate)?
43. Governance
• Are the mission & vision in place and well
understood?
• Are the team’s principles being used to
evaluate the quality of design work?
• Are decision-making processes better
understood and acted upon?
44. Business Ops
• Is the DesignOps team creating and
maintaining relationships with key
BusinessOps teams to ensure smooth
DxD functioning?
45. How do you know if you are
measuring the right things?
64. Impact of Agile
• Research, interpretation, synthesis, insights, and
empathy are not part of most agile processes.
• Defining the deliverable as “working software” doesn’t
encourage exploring and creating in the abstract.
• A single cadence for the entire team doesn’t consider
that design & research practices work differently
than development.