Information Architecture Conference 2020 presentation. Learn how large-scale, anonymous usability studies (card sorts, tree testing, and first-click testing) can inform and improve formal in-person testing, helping to create more citizen-centric Federal Government websites. Appendices include extensive source links.
1. All Card Sorts Great and Small
Reflections on Card Sorts, Citizen Engagement,
and Creating Better Government Website IAs
Jeffrey Ryan Pass ⦙ IAC20: The Information Architecture Conference 2020 ⦙ @jeffpass ⦙ #IAC20
2. Contents
Preface: I’m Sorry, Who Are You Again?
Disclaimer: Not My Average Presentation Deck
Chapter One: Experts Say, Just Enough (Card Sort) Participants.
Chapter Two: Whoa There, Buckaroo, Uncle Sam Wants to Weigh In.
Chapter Three: Too Many Participants? Sounds Just About Right.
Chapter Four: All the Lessons Learned.
Appendices: All the Links and References.
Jeffrey Ryan Pass ⦙ All Card Sorts Great and Small ⦙ IAC20 ⦙ @jeffpass ⦙ #IAC20
4. I’m Jeffrey Ryan Pass…
• Designed my first website in 1997
(slice & optimize baby!)
• 21 years as a U.S. federal govt. UX
consultant (make it stop!)
• Longtime IA Summit / IA
Conference attendee, volunteer,
presenter, organizer
• Same for WIAD DC, UXDC and DCUX, UXCamp DC, UXPA DC, and UX COPs at past and present employers
At IA Summit 2012 in NOLA
(photo: @wendywoowho)
In hiding at home for IAC20
(photo: @jeffpass)
6. My typical deck versus this one
Typical
• Highly (overly) designed
• Image-heavy, no unifying format
• Riddled with pop culture references
• And provocative imagery
• Sorry about that ; )
This one
• Sparse (almost devoid of) design
• Text-heavy, boring format, few images
• Riddled with data and source references
• And painful charts and graphs
• Sorry about that : (
7. Additional Disclaimers
• Despite the bucolic title page image (ruins along the River Corrib in Galway, Ireland), this presentation does not reference Alf Wight, his book All Creatures Great and Small, or the BBC television series, other than in its title/chapter pages and this disclaimer. Apologies, fans.
• This presentation revisits and expands on a 2013 case study presented at IA Summit and
UXPA International.
• This presentation contains screenshots of Optimal Workshop tools (OptimalSort, Treejack, and Chalkmark). This is not intended as an explicit endorsement, though I love Optimal’s tools. There are a plethora of excellent digital card sort, tree test, and first-click testing tools. Find the one that works best for you, your project, employer, and/or client.
9. Card Sorts (101)
Participants place “cards” into categories representing the structure or organization of a site. Three basic types:
• Closed card sorts: cards placed into pre-defined categories
• Open card sorts: cards placed into participant-created categories
• Hybrid card sorts: cards placed into pre-defined or participant-created categories
Screenshot of an in-progress, closed card sort
(test & images: @jeffpass)
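Open-sort results are typically analyzed by counting how often participants group each pair of cards together, yielding a co-occurrence (similarity) matrix. A minimal sketch of that aggregation, not tied to any specific tool; the card and category names are hypothetical:

```python
from itertools import combinations

def co_occurrence(sorts):
    """Count how often each pair of cards lands in the same
    participant-created category, across all participants."""
    counts = {}
    for categories in sorts:  # one dict of {category: cards} per participant
        for cards in categories.values():
            for a, b in combinations(sorted(cards), 2):
                counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts

# Three hypothetical participants sorting four cards
sorts = [
    {"Mail": ["Stamps", "Envelopes"], "Help": ["FAQ", "Contact"]},
    {"Shop": ["Stamps", "Envelopes", "FAQ"], "Support": ["Contact"]},
    {"Products": ["Stamps", "Envelopes"], "Help": ["FAQ", "Contact"]},
]
pairs = co_occurrence(sorts)
print(pairs[("Envelopes", "Stamps")])  # 3 — all three participants grouped these
```

Pairs that co-occur for most participants are strong candidates to live together in the final IA; tools like OptimalSort compute this matrix (and dendrograms from it) automatically.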
10. Tree Tests (101)
Participants identify where they would expect to find content within a defined folder structure
• Like Windows Explorer or Mac Finder
• Typically a question- or task-based study (as opposed to card-based)
Screenshots of an in-progress, tree test (test & images: @jeffpass)
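Tree-test tasks are usually scored on two things: success (did the participant end at the correct node?) and directness (did they get there without backtracking?). A toy scoring sketch, with hypothetical node names and a deliberately crude backtracking check:

```python
def score_task(paths, correct):
    """Score each participant path: (success, direct success).
    A path is 'direct' here if no node is ever revisited."""
    results = []
    for path in paths:
        success = path[-1] == correct
        direct = len(path) == len(set(path))  # crude: no repeated nodes
        results.append((success, success and direct))
    return results

# Hypothetical participant paths through a folder tree
paths = [
    ["Home", "Mail", "Tracking"],                  # direct success
    ["Home", "Shop", "Home", "Mail", "Tracking"],  # success after backtracking
    ["Home", "Help", "FAQ"],                       # failure
]
scores = score_task(paths, correct="Tracking")
print(sum(s for s, _ in scores), "of", len(paths), "succeeded")  # 2 of 3 succeeded
```

Real tools (e.g., Treejack) report these as success rate and directness per task; this sketch only illustrates the idea.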
11. Card Sort & Tree Test Sample Sizes
In general, most people agree that you need:
• A 15–20 participant sample for closed card sorts
• A 15–20% larger sampling for open card sorts
• An additional 10–15% larger sampling for tree tests
But honestly, is that based on anything substantive?
Turns out, “yes,” it is.
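The rule of thumb above is simple arithmetic on the 15–20 participant baseline. A sketch of that arithmetic; applying the lower percentage to the low end and the higher percentage to the high end is my assumption, not something the sources specify:

```python
import math

closed = (15, 20)                          # closed-sort baseline
open_sort = (math.ceil(closed[0] * 1.15),  # +15% on the low end
             math.ceil(closed[1] * 1.20))  # +20% on the high end
tree = (math.ceil(open_sort[0] * 1.10),    # an additional +10%
        math.ceil(open_sort[1] * 1.15))    # an additional +15%

print(open_sort)  # (18, 24)
print(tree)       # (20, 28)
```

So the rule of thumb implies roughly 18–24 participants for open sorts and 20–28 for tree tests.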
12. 29 (rounded up from 28.6)
16 (rounded up from 15.8)
13. Duplicate sources?
The eagle-eyed may have noticed some duplicate sources and wondered about them.
Well, like most things, there is more to the numbers than just the numbers, and a little bit of context (and inquiry) is useful.
14. Duplicate sources examined (1/2)
Tullis & Wood (+ Wood & Wood)
• 2004 [1]: original assessment of 20 minimum (min)
• 2005 [2]: 30 min for when participants only receive a sampling of cards
• 2008 (Wood & Wood) [3]: increased to 25 min, adjusting for researcher overestimation of user familiarity with the domain being tested
Freed
• 2012 [4]: original assessment of 15 minimum based on Nielsen, but recommended doubling that number
• 2018 [5]: revised down to align with Nielsen, with 15 as optimum
15. Duplicate sources examined (2/2)
Optimal Workshop
• 2013 [6]: original assessment of 20 min, 40 optimum, 60 maximum for an open sort (20/40/60)
  • 40/80/150 for a closed sort
  • 10/120/400 for a tree test
• 2018 [7]: adjusted up (no reasoning provided):
  • 30/50/~ for sorts (open vs. closed not specified)
  • 50/100/~ for tree tests
UserZoom
• 2013 [8]: original assessment of 50 participants as optimal
• 2018 [9]: updated with detail:
  • 9 min for identifying issues at 20% problem/insight occurrence
  • 65 min for KPIs w/ 90% confidence w/ 10% margin of error
  • 115 min for comparison testing
16. Importance of minimum sampling
For most test types, you need a minimum sampling to reach:
• Minimum correlation between the number of users and the results
  • Nielsen recommends 15 participants for a “comfortable” 0.90 correlation [10]
• Reliable problem/insight occurrence
  • UserZoom recommends 9 participants for a target 20% occurrence [9]
• Minimal margin of error
  • UserZoom recommends 9 participants for a maximum 10% margin of error [9]
• Sampling requirements increase with complexity
  • E.g., for calculating KPIs, testing randomized subsets of cards, etc. [9]
17. Disadvantages of oversampling
In general, disadvantages relate to diminishing returns and workload:
• Nielsen points out that to increase from a correlation of 0.90 (15 users) to 0.95, you need to double the sample size, and with it study LOE, cost, etc. [10] But…
  • It’s less of an issue if your participants are volunteers (or “voluntolds”)
  • And tools like OptimalSort and Treejack include capabilities that make quick work of analysis and (basic) reporting for even massive tests
• If qualitative elements are included, analysis, synthesis, and reporting become more onerous as study size increases
  • However, advances in AI and natural language processing mitigate this
19. So why oversample?
Well, the answer lies not in correlation between the number of users and results, problem/insight occurrence, or margin of error.
It’s because of this guy…
Shepard Fairey’s Obama “Hope” poster, changing “Hope” to “Engage”
(original image: Smithsonian)
20. The Digital Government Strategy
On May 23, 2012, the Obama White House CIO and CTO released two documents that kicked off a government-wide digital modernization push:
• Presidential Memorandum: Building a 21st Century Digital Government
• Digital Government: Building a 21st Century Platform to Better Serve the American People
Together, these documents came to be known as the Digital Government Strategy.
Image of the printed cover of the Digital Government Strategy
(cover: GPO; image: Digital.gov)
21. Implementing the Strategy
The Digital Government Strategy identified four overarching principles to guide the transformation of Federal Government websites:
• Information-Centric
• Shared Platform
• Customer-Centric (including voice of consumer/VOC feedback tools)
• Security and Privacy
To implement the strategy, one also had to comply with the laws, policies, and regulations for federal agency websites and digital services. Here’s a list of just the ones that relate to card sorts and tree tests…
22. Several rules and regulations must be considered… (1 of 4)
1998 - Government Paperwork Elimination Act (GPEA)
1988 - Section 508 of the Rehabilitation Act of 1973
1993 - Executive Order 12866
1995 - Paperwork Reduction Act (44 U.S.C. 3501 et seq.)
1996 - Executive Order 12988
2002 - E-Government Act of 2002, Section 207
2002 - Small Business Paperwork Relief Act of 2002
2008 - Web Content Accessibility Guidelines (WCAG) 2.0
2010 - OMB M-10-22, Guidance for Online Use of Web Measurement and Customization Technologies
23. Several rules and regulations must be considered… (2 of 4)
2010 - OMB M-11-07, Facilitating Scientific Research by Streamlining the Paperwork Reduction Act Process
2010 - Plain Writing Act of 2010
2010 - OMB Final Guidance on Implementing the Plain Writing Act of 2010
2010 - Executive Memorandum: Social Media, Web-Based Interactive Technologies, and the Paperwork Reduction Act
2010 - Social Media, Web-Based Interactive Technologies, and the Paperwork Reduction Act
2010 - The Privacy Act of 1974, as amended, 5 U.S.C. § 552a
2011 - Paperwork Reduction Act Fast Track Process
2011 - OMB M-11-26, Fast-Track Process for Collecting Service Delivery Feedback Under the Paperwork Reduction Act
24. Several rules and regulations must be considered… (3 of 4)
2011 - Executive Order 13571 – Streamlining Service Delivery and Improving Customer Service
2011 - New Fast-Track Process for Collecting Service Delivery Feedback Under the Paperwork Reduction Act
2012 - Digital Government Strategy
2012 - OMB Memo on Testing and Simplifying Federal Forms
2013 - Memorandum on Social Media, Web-Based Interactive Technologies, and the Paperwork Reduction Act
2015 - U.S. Web Design System (USWDS) & USWDS Standards
2016 - OMB M-17-06, Policies for Federal Agency Public Websites and Digital Services
2016 - OMB Circular A-130, Managing Information as a Strategic Resource
2016 - Federal Collection of Information (Resource Collection)
25. Several rules and regulations must be considered… (4 of 4)
2016 - Memorandum on Flexibilities under the Paperwork Reduction Act for Compliance with Information Collection Requirements
2017 - Information and Communication Technology (ICT) Standards and Guidelines
2018 - 21st Century Integrated Digital Experience Act (21st Century IDEA Act)
2018 - Connected Government Act, January 2018
2019 - Required Web Content and Links
2019 - OMB Circular A-11 Section 280, Managing Customer Experience and Improving Service Delivery
2020 - Digital.gov Tools and Services
2020 - Checklist of Requirements for Federal Websites and Digital Services
26. Let’s just focus on a few…
• The Paperwork Reduction Act (PRA)
• And related Fast-Track rules and regulations
• Executive Orders and Memoranda clarifying PRA
• And mitigating its deterrents to implementing the Digital Government
Strategy
27. The Paperwork Reduction Act (PRA) of 1995
• Established to reduce the paperwork burden imposed on citizens and private businesses
  • And to limit the collection of Personally Identifiable Information (PII)
• Requires U.S. federal government agencies to obtain Office of Management and Budget (OMB) approval before collecting most types of information from the public
• Includes two processes for obtaining clearance through OMB:
  • Traditional Clearance (6–9 months)
  • Fast-Track Clearance (~5 days per request, but qualifying takes 6+ months)
• Is the bane of many IA, UX, CS, and CX professionals’ existence
28. The PRA Applies
• To any data collection vehicle involving more than 10 respondents and
utilizing standardized questions
• To any type of respondent (individuals, organizations, states, etc.)
• Regardless of form or format (interview, survey, etc.)
• Regardless of collection method (in-person, telephonic, electronic)
Important exception: federal employees and contractors are exempt
from PRA (test ‘em to your heart’s content)
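The applicability criteria above can be summarized as a crude screening check. This is an illustration only, a simplification of the slide’s criteria and emphatically not legal guidance; the function and parameter names are hypothetical:

```python
def pra_may_apply(respondents: int,
                  standardized_questions: bool,
                  federal_staff_only: bool) -> bool:
    """Crude screen based on the criteria above: PRA clearance is a concern
    for standardized collections from more than 10 members of the public,
    regardless of respondent type, form/format, or collection method.
    Federal employees and contractors are exempt."""
    if federal_staff_only:
        return False  # the important exception: test 'em to your heart's content
    return respondents > 10 and standardized_questions

# A 50-respondent public card sort with standardized cards/questions
print(pra_may_apply(50, True, False))  # True — talk to your PRA Clearance Officer
# The same study run only with agency staff
print(pra_may_apply(50, True, True))   # False — exempt
```

In practice, edge cases (opt-in feedback, non-structured responses, social media) hinge on the memoranda discussed later, so the real answer always comes from the agency’s PRA Clearance Officer.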
29. Federal Agencies and PRA
• Agencies are required to have a PRA Clearance Officer
• Sometimes combined with other compliance roles (e.g., FOIA, Privacy, etc.)
• Sometimes residing within the cabinet-level parent agency
• Never start work on a Traditional Clearance request before confirming whether the agency is Fast-Track approved
  • If your agency does not have Fast-Track, work with your PRA Clearance Officer to pursue it
  • The level of effort (LOE) is comparable to securing a Traditional Clearance, but once attained, individual clearances can be achieved more quickly and easily
30. Amendments and Clarifications to PRA (1 of 3)
Any number of documents extend, modify, or clarify the PRA:
• Executive Memorandum: Social Media, Web-Based Interactive
Technologies, and the Paperwork Reduction Act (2010) clarifies:
• If no Personally Identifiable Information (PII) is collected (beyond name and email address), and repetitive, structured responses are not required, the online data collection activity *may* not be subject to PRA
• Electronic mailing addresses collected for agency mailing lists are not subject to PRA, nor are “suggestion boxes” (opt-in feedback collection initiatives)
• Certain uses of social media and web-based interactive technologies will be treated as
equivalent to activities that are currently excluded from the PRA
31. Amendments and Clarifications to PRA (2 of 3)
• Social Media, Web-Based Interactive Technologies, and the Paperwork Reduction Act (2010) clarifies:
  • Under established principles, the PRA does not apply to many uses of media and technologies
• OMB M-11-26, Fast-Track Process for Collecting Service Delivery Feedback Under the Paperwork Reduction Act (2011) establishes:
  • The Fast-Track process that (together with a required generic clearance) allows individual service-delivery feedback collections to be approved quickly
• Memorandum on Social Media, Web-Based Interactive Technologies, and the Paperwork Reduction Act (2013) establishes:
  • Social media and web-based interactive technologies require no PRA approval
32. Amendments and Clarifications to PRA (3 of 3)
• Memorandum on Flexibilities under the Paperwork Reduction Act for Compliance with Information Collection Requirements (2016) clarifies:
  • The PRA excludes most social media and web-based interactive technologies
• OMB Circular A-11 Section 280, Managing Customer Experience and Improving Service Delivery (2019) establishes:
  • Expectations for Federal Government customer experience as well as service delivery, and promotes a “CX-mindful culture across Federal Government services,” requiring measurement and implying continuous improvement
34. Case Study 1: CFTC - CFTCnet (1/2)
2010 card sort of proposed CFTC intranet IA:
• Initial email invite sent to all-CFTC by the team
  • Completion rate of < 3%
• Subsequent email invite sent by the agency CTO
  • 95 participants (69% completion rate)
  • 45% organizational completion rate (CFTC then had ~210 staff)
• Anonymous study; participants asked for department and role via radio buttons (optional)
  • Completion rate of optional questions < 24%
Seal and logo/logotype: CFTC
35. Case Study 1: CFTC - CFTCnet (2/2)
Lessons learned:
• PRA does not apply to federal government employees and contractors
• Have authority figures send study invites
• Incentivize would-be participants
• Don’t expect high completion rates on optional self-identification (no PII) questions
• Find a PRA-compliant approach to data collection
Seal and logo/logotype: CFTC
36. Case Study 2: SAMHSA – SAMHSA.gov (1/4)
2011–2013 iterative tree tests and card sorts of
extant and proposed SAMHSA.gov IAs:
• Effort to consolidate 100 sites to one, including
parallel internal and external test activities
• Internal staff (employees and contractors):
• Identified by departments; each invited only once
• Most email invites sent out from comms senior staff
• Anonymous; no collection of department/role info
• Testing identical IAs as the external studies, but with staff-
specific instructions and cards
Logo and logotype: SAMHSA
37. Case Study 2: SAMHSA – SAMHSA.gov (2/4)
External domain professionals and interested members of the general public (no incentive):
• Recruits: via SAMHSA Facebook and Twitter posts describing the project and requesting help
• Screeners: posts pointed to SAMHSA blog posts describing the project and each study (including past study results); separate participation links provided for professionals and the general public
• Tests: participation links pointed to card sorts and tree tests with identical IAs, but audience-specific instructions and cards
Logo and logotype: SAMHSA
38. Case Study 2: SAMHSA – SAMHSA.gov (3/4)
Extant IA Tree Test (2011) — 555 total participants
• Professionals: 821 completed of 1065 initiated (77%)
• Employees + contractors: 30 of 50 (60%)
Proposed IA Closed Card Sort 1 (2012) — 555 total participants
• General public: 150 of 331 (45%)
• Domain professionals: 329 of 633 (52%)
• Employees + contractors: 2 of 3 (40%)
Proposed IA Closed Card Sort 2 (2012) — 555 total participants
• General public: 202 of 375 (54%)
• Domain professionals: 875 of 1282 (68%)
• Employees + contractors: 71 of 107 (66%)
Extant IA (Section) Open Card Sort (2013) — 555 total participants
• General public: 96 of 200 (48%)
• Domain professionals: 312 of 505 (62%)
• Employees + contractors: 44 of 77 (57%)
39. Case Study 2: SAMHSA – SAMHSA.gov (4/4)
Lessons learned:
• Public-facing documentation of the project, approach, and results was highly successful
• Anonymous, opt-in recruiting approach not subject to PRA
• Cloned tests (of identical IAs) allowed for high-level test result segmentation
• Have authority figures send all internal invites
  • The only staff completion rate below 57% was for an invite sent out by the project team
Logo and logotype: SAMHSA
40. Case Study 3: USPS – LiteBlue & USPS.com (1 of 6)
2014–2020 card sorts and tree tests of the management intranet and USPS.com IAs:
• Leveraged CFTC and SAMHSA approaches to improve the management intranet, LiteBlue (note: “Blue” is the non-management intranet)
• Based on the success of intranet testing, adopted large-scale card sorts, tree tests, and first-click tests for USPS.com
Logo and logotype: USPS®
41. Case Study 3: USPS – LiteBlue & USPS.com (2 of 6)
USPS intranet testing:
• Recruiting pool identified by senior staff
• Test goals and request for participation introduced by the VP for HR on the senior staff monthly status call
• Senior staff sent email invites to their departments
• Anonymous, but required fields for identifying high-level department and role affiliation
Logo and logotype: USPS®
42. Case Study 3: USPS – LiteBlue & USPS.com (3 of 6)
USPS.com testing:
• Iterative testing of overall site and section IAs
• Recruiting from customer feedback responses
  • Includes opt-in for usability study participation
• Email invites sent from support@USPS.gov, including separate links for consumers vs. business users
• Opt-in links for consumer and business tests with the same IAs but audience-based directions and cards
• Card sorts and tree tests also incorporated into usability tests (for contextual inquiry)
Logo and logotype: USPS®
43. Case Study 3: USPS – LiteBlue & USPS.com (4 of 6)
LiteBlue Intranet IA Tree Test (2014)
• Management staff: 234 completed of 261 initiated (90%)
USPS.com Simplification IA Tree Test (2014) — 282 total participants
• Consumers: 188 of 198 (95%)
• Business users: 94 of 99 (95%)
USPS.com Tracking First-Click Test (2016) — 378 total participants
• Desktop users: 260 of 326 (80%)
• Mobile users: 118 of 151 (78%)
All three tests had amazing response rates and resulted in demonstrable improvements.
And then we decided to do something rather insane…
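The completion percentages above are simply completed ÷ initiated, rounded to whole percentages. A quick check of the table’s figures (the dictionary labels are mine, shorthand for the table rows):

```python
def completion(completed, initiated):
    """Completion rate as a whole-number percentage."""
    return round(100 * completed / initiated)

table = {
    "LiteBlue tree test, management": (234, 261),
    "USPS.com tree test, consumers": (188, 198),
    "USPS.com tree test, business": (94, 99),
    "First-click test, desktop": (260, 326),
    "First-click test, mobile": (118, 151),
}
rates = {name: completion(*counts) for name, counts in table.items()}
print(rates)  # 90, 95, 95, 80, 78 — matching the table
```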
44. Case Study 3: USPS – LiteBlue & USPS.com (5 of 6)
USPS.com IA Comparison Card Sorts (2019): 2 IAs + an open sort, 2 groups for each sort, 231 total participants
Current IA, closed sort — 900 invites, 22% started, 55% finished
• Consumers: 93 completed of 169 initiated (55%)
• Business users: 15 of 29 (52%)
New IA, closed/hybrid sort — 900 invites, 11% started, 69% finished
• Consumers: 60 of 85 (71%)
• Business users: 10 of 17 (59%)
Open sort — 900 invites, 12% started, 48% finished
• Consumers: 44 of 98 (45%)
• Business users: 9 of 13 (69%)
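The per-sort funnel figures in the comparison study (900 invites each, “started” and “finished” percentages) follow directly from the audience rows. A check of all three funnels:

```python
def funnel(rows, invites=900):
    """rows: (completed, initiated) per audience group in one sort.
    Returns (% of invitees who started, % of starters who finished)."""
    completed = sum(c for c, _ in rows)
    initiated = sum(i for _, i in rows)
    started = round(100 * initiated / invites)
    finished = round(100 * completed / initiated)
    return started, finished

print(funnel([(93, 169), (15, 29)]))  # (22, 55) — current IA, closed sort
print(funnel([(60, 85), (10, 17)]))   # (11, 69) — new IA, closed/hybrid sort
print(funnel([(44, 98), (9, 13)]))    # (12, 48) — open sort
```

Note how start rate and finish rate move independently: the new-IA sort attracted half as many starters but kept far more of them, which is exactly the kind of signal the comparison was after.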
45. Case Study 3: USPS – LiteBlue & USPS.com (6 of 6)
Logo and logotype: USPS®
Lessons learned:
• You do not have to incentivize a captured, helpful audience to get an amazing response rate
• Oversampling for first-click tests produces positive results similar to card sorts and tree tests
• Certainty in ambitious comparison test results can be undermined by undersampling on one or two sorts
  • Despite very positive completion rates
• Email invites had low start rates, but good completion rates
• Use small-scale testing for qualitative learnings
47. Oversampling advantages (already covered)
• Despite diminishing returns, oversampling can:
  • Increase correlation between the number of users and results, as well as problem/insight occurrence [9, 10]
  • Reduce margin of error [9]
• Modern tools automate much analysis and basic reporting
  • Especially for carefully designed tests (minimize qualitative data collection)
• User engagement can create buzz and promote participation, even without incentives being paid
• Novel (social media- and feedback-based) recruiting methods work well
48. But remember…
• For internal testing, ensure invites are sent out by authority figures (and that there is leadership buy-in)
• PRA doesn’t apply to federal employees and contractors
• Anonymous, opt-in categorization enables large-scale studies without capturing PII or triggering PRA
  • But you’ll need to clone the test for each category
• Don’t expect high start rates from email invites, or high completion rates if using lots of qualitative data collection
49. Oversampling advantages (the new ones)
• Big numbers can impress stakeholders and generate momentum behind IA modifications or improvements
  • They may only minimally increase confidence, but they tap into government directives related to citizen engagement and therefore excite senior staff
• Allows you to engage “captured” audiences (like feedback respondents)
• Gives users a sense of progress and of being part of the process of continuous improvement
50. Appendices
All the Links and References
• Endnotes
• Additional sources
• Government website requirements
51. Endnotes
1. Tullis, T. S., & Wood, L. E. (2004). How Many Users Are Enough for a Card-Sorting Study? UPA 2004 Conference, Minneapolis, MN. Retrieved 12/21/2012 from http://home.comcast.net/~tomtullis/publications/UPA2004CardSorting.pdf. Most recently retrieved 04/06/2020 from: https://www.researchgate.net/publication/254164354_How_Many_Users_Are_Enough_for_a_Card-Sorting_Study
2. Tullis, T. S., & Wood, L. E. (2005). How Can You Do a Card-sorting Study with LOTS of Cards? UPA 2005 Conference, Montreal, Quebec, Canada. Retrieved 12/21/2012 from http://www.eastonmass.net/tullis/presentations/Tullis&Wood-CardSorting.pdf. Most recently retrieved 04/06/2020 from: https://web.archive.org/web/20130401051722/http://www.eastonmass.net/tullis/presentations/Tullis&Wood-CardSorting.pdf
3. Wood, J. R., & Wood, L. E. (2008). Card Sorting: Current Practices and Beyond. Journal of Usability Studies, Volume 4, Issue 1, 11/2008. Retrieved 03/12/2013 from http://www.upassoc.org/upa_publications/jus/2008november/wood3.html. Most recently retrieved 04/06/2020 from: https://web.archive.org/web/20111209083121/http://www.upassoc.org/upa_publications/jus/2008november/wood3.html
4. Freed, E. (2012). How-To Guide for Intranet Card Sorting. The Social Intranet Blog (09/11/2012). Originally retrieved 03/12/2013 from http://www.thoughtfarmer.com/blog/2012/09/11/intranet-card-sorting/; most recently retrieved 04/05/2020 from https://web.archive.org/web/20130623074139/.
5. Freed, E. (2018). How-To Guide for Intranet Card Sorting. The Social Intranet Blog (09/06/2018). Retrieved 04/05/2020 from https://www.thoughtfarmer.com/blog/intranet-card-sorting/.
6. Optimal Workshop (2011). How Many Participants Do I Need for My Survey? (And How Many Should I Invite?). Optimal Workshop Support Knowledge Base, 11/14/2011. Retrieved 03/12/2013 from http://www.optimalworkshop.com/help/kb/remote-user-testing/how-many-participants-do-i-need-for-my-survey-and-how-many-should-i-invite.
7. Optimal Workshop (2018). How many participants you need for useful data. Optimal Workshop Learn Center. Retrieved 04/05/2020 from: https://support.optimalworkshop.com/en/articles/2626908-how-many-participants-you-need-for-useful-data.
8. UserZoom (2011). Online Card Sorting: What, How & Why? UserZoom, 01/20/2011. Retrieved 03/12/2013 from http://www.userzoom.com/online-card-sorting-what-how-why/.
9. Kelkar, K., UserZoom (2018). Demystifying Sample Size: How Many Participants Do You Really Need for UX Research. Retrieved 04/05/2020 from: https://www.userzoom.com/blog/webinar/recording-demystifying-sample-size/.
10. Nielsen, J. (2004). Card Sorting: How Many Users to Test. Jakob Nielsen’s Alertbox, July 19, 2004. Retrieved 12/21/2012 and again on 03/10/2020 from http://www.nngroup.com/articles/card-sorting-how-many-users-to-test/.
52. Additional sources (1 of 2)
Gaffney, G. (2000). What is Card Sorting? Information & Design, 2000. Retrieved 03/12/2013 from http://www.ida.liu.se/~TDDD26/material/CardSort.pdf. Most recently retrieved 04/06/2020 from: https://www.academia.edu/378581/What_is_Card_Sorting.
Paul, C. L. (2008). A Modified Delphi Approach to a New Card Sorting Methodology. JUS Journal of Usability Studies, Volume 4, Issue 1, November 2008. Retrieved 03/12/2013 from
http://www.academia.edu/150978/A_Modified_Delphi_Approach_to_a_New_Card_Sorting_Methodology.
Robertson, J. (2001). Information Design Using Card Sorting. Step Two Designs, 02/19/2001. Retrieved 04/06/2020 from http://www.steptwo.com.au/papers/cardsorting/index.html.
Sachs, J. (2002). Aristotle's Metaphysics. Green Lion Press, Santa Fe, NM.
Spencer, D., & Warfel, T. (2004). Card Sorting: A Definitive Guide. Boxes and Arrows 04/07/2004. Retrieved 04/05/2020 from https://boxesandarrows.com/card-sorting-a-definitive-guide/.
53. Additional sources (2 of 2)
Linked Data:
• Full slide deck at SlideShare: https://www.slideshare.net/JeffreyRyanPass/IAC20-All-Cardsorts-Great-and-Small-by-Jeff-Pass
• Card Sort Minimum and Optimum Size Sources (spreadsheet including source links):
https://docs.google.com/spreadsheets/d/1of0JYMTTD9nENNpL_zI8I5fj8GNZrQxo6JmPkUFjF2M/edit?usp=sharing
• Requirements for Federal Website Sources (spreadsheet including source links): https://docs.google.com/spreadsheets/d/18CIEt9CAy2mxSdU1OAUjrBwV9lgAxBrytHRiFg861-M/edit?usp=sharing
Imagery:
• All photography by Jeffrey Ryan Pass unless noted otherwise
• All OptimalSort and Treejack screenshots from studies designed and facilitated by Jeffrey Ryan Pass using the Booz Allen Hamilton Optimal Workshop enterprise license
• CFTC, SAMHSA, and USPS logos and logotype all unaltered, all public domain
• Original Shepard Fairey “Hope” print downloaded from the Smithsonian National Portrait Gallery (print owner)
• Link: https://npg.si.edu/object/npg_NPG.2013.46?destination=node/63231%3Fedan_q%3Dfairey%2520hope
• Manipulation and usage consistent with Smithsonian Terms of Use: https://www.si.edu/Termsofuse
• Digital Government Strategy image from Digital.gov; design and printing by GPO: https://digital.gov/2012/06/27/hitting-the-ground-running-with-the-digital-strategy/
54. Government website requirements (partial list) (1 of 3)
This listing captures U.S. Federal Government rules, regulations, memoranda, and other guidance that relates to card sorts, tree tests, and similar online, unmoderated
usability studies. It is not a complete listing of all requirements for government websites and digital applications. Additionally, these requirements are always evolving
(obviously), so this listing should not be considered all-inclusive or complete and current beyond April 2020.
Government Paperwork Elimination Act (GPEA) (1998): https://digital.gov/resources/implementation-of-the-government-paperwork-elimination-act/ and https://obamawhitehouse.archives.gov/omb/fedreg_gpea2/
Section 508 of the Rehabilitation Act of 1973: https://www.section508.gov/manage/laws-and-policies and https://www.access-board.gov/guidelines-and-standards/communications-and-it/about-the-ict-refresh/final-rule
Executive Order 12866: https://www.archives.gov/files/federal-register/executive-orders/pdf/12866.pdf
Paperwork Reduction Act (44 U.S.C. 3501 et seq.): https://www.govinfo.gov/content/pkg/PLAW-104publ13/html/PLAW-104publ13.htm and https://digital.gov/resources/paperwork-reduction-act-44-u-s-c-3501-et-seq/
Executive Order 12988: https://www.govinfo.gov/content/pkg/WCPD-1996-02-12/pdf/WCPD-1996-02-12-Pg189.pdf
E-Government Act of 2002, Section 207: https://www.archives.gov/about/laws/egov-act-section-207.html and https://www.govinfo.gov/app/details/PLAW-107publ347
Small Business Paperwork Relief Act of 2002: https://www.govinfo.gov/content/pkg/STATUTE-116/pdf/STATUTE-116-Pg729.pdf and https://www.govinfo.gov/app/details/PLAW-107publ198
Web Content Accessibility Guidelines (WCAG) 2.0: https://www.w3.org/WAI/standards-guidelines/wcag/ and https://www.w3.org/TR/WCAG20/
OMB M-10-22, Guidance for Online Use of Web Measurement and Customization Technologies: https://obamawhitehouse.archives.gov/sites/default/files/omb/assets/memoranda_2010/m10-22.pdf
OMB M-11-07, Facilitating Scientific Research by Streamlining the Paperwork Reduction Act Process: https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2011/m11-07.pdf
Plain Writing Act of 2010: https://plainlanguage.gov and https://www.govinfo.gov/app/details/PLAW-111publ274
55. Government website requirements (partial list) (2 of 3)
OMB Final Guidance on Implementing the Plain Writing Act of 2010: https://plainlanguage.gov/ and https://obamawhitehouse.archives.gov/sites/default/files/omb/memoranda/2011/m11-15.pdf
Executive Memorandum: Social Media, Web-Based Interactive Technologies, and the Paperwork Reduction Act: https://obamawhitehouse.archives.gov/sites/default/files/omb/assets/inforeg/SocialMediaGuidance_04072010.pdf and https://digital.gov/resources/social-media-web-based-interactive-technologies-and-the-paperwork-reduction-act/
The Privacy Act of 1974, as amended, 5 U.S.C. § 552a: https://www.justice.gov/opcl/privacy-act-1974, https://www.govinfo.gov/content/pkg/USCODE-2018-title5/pdf/USCODE-2018-title5-partI-chap5-subchapII-sec552a.pdf, and https://www.govinfo.gov/app/details/USCODE-2010-title5/USCODE-2010-title5-partI-chap5-subchapII-sec552a/summary
Paperwork Reduction Act Fast Track Process: https://obamawhitehouse.archives.gov/sites/default/files/omb/assets/inforeg/PRA_Gen_ICRs_5-28-2010.pdf, https://digital.gov/resources/paperwork-reduction-act-fast-track-process/, and https://www.usability.gov/how-to-and-tools/guidance/fast-track-clearance-process.html
OMB M-11-26, Fast-Track Process for Collecting Service Delivery Feedback Under the Paperwork Reduction Act: https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2011/m11-26.pdf
Executive Order 13571 – Streamlining Service Delivery and Improving Customer Service (April 2011): https://obamawhitehouse.archives.gov/the-press-office/2011/04/27/executive-order-13571-streamlining-service-delivery-and-improving-custom
Presidential Memorandum: Building a 21st Century Digital Government: https://obamawhitehouse.archives.gov/the-press-office/2012/05/23/presidential-memorandum-building-21st-century-digital-government
Digital Government: Building a 21st Century Platform to Better Serve the American People: https://obamawhitehouse.archives.gov/sites/default/files/omb/egov/digital-government/digital-government.html
Digital Government Strategy (May 2012): https://obamawhitehouse.archives.gov/sites/default/files/omb/egov/digital-government/digital-government.html
OMB Memo on Testing and Simplifying Federal Forms: https://obamawhitehouse.archives.gov/sites/default/files/omb/inforeg/memos/testing-and-simplifying-federal-forms.pdf
56. Government website requirements (partial list) (3 of 3)
Executive Order 13642, Making Open and Machine Readable the New Default for Government Information: https://www.govinfo.gov/content/pkg/CFR-2014-title3-vol1/pdf/CFR-2014-title3-vol1-eo13642.pdf and https://obamawhitehouse.archives.gov/the-press-office/2013/05/09/executive-order-making-open-and-machine-readable-new-default-government-
U.S. Web Design System (USWDS) & USWDS Standards: https://designsystem.digital.gov/
OMB M-17-06, Policies for Federal Agency Public Websites and Digital Services: https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2017/m-17-06.pdf
OMB Circular A-130, Managing Information as a Strategic Resource (July 28, 2016): https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/circulars/A130/a130revised.pdf
Federal Collection of Information (Resource Collection): https://obamawhitehouse.archives.gov/omb/inforeg_infocoll/
Memorandum on Flexibilities under the Paperwork Reduction Act for Compliance with Information Collection Requirements: https://obamawhitehouse.archives.gov/sites/default/files/omb/inforeg/pra_flexibilities_memo_7_22_16_finalI.pdf
Information and Communication Technology (ICT) Standards and Guidelines: https://www.access-board.gov/guidelines-and-standards/communications-and-it/about-the-ict-refresh/final-rule/text-of-the-standards-and-guidelines#C203-electronic-content and https://www.federalregister.gov/documents/2017/01/18/2017-00395/information-and-communication-technology-ict-standards-and-guidelines
21st Century Integrated Digital Experience Act (21st Century IDEA Act) [Explanation]: https://digital.gov/resources/21st-century-integrated-digital-experience-act/
21st Century Integrated Digital Experience Act (21st Century IDEA Act): https://www.congress.gov/115/plaws/publ336/PLAW-115publ336.pdf
Connected Government Act, January 2018: https://digital.gov/resources/connected-government-act/ and https://www.congress.gov/bill/115th-congress/house-bill/2331
Required Web Content and Links: https://digital.gov/resources/required-web-content-and-links/
OMB Circular A-11 Section 280, Managing Customer Experience and Improving Service Delivery (2019): https://www.performance.gov/cx/a11-280.pdf
Digital.gov Tools and Services, U.S. General Services Administration (GSA): https://digital.gov/services/ and https://digital.gov/services/directory/
Checklist of Requirements for Federal Websites and Digital Services: https://digital.gov/resources/checklist-of-requirements-for-federal-digital-services/
57. End / Thank you
Jeffrey Ryan Pass
All Card Sorts Great and Small
IAC20: The Information Architecture Conference 2020
@jeffpass
#IAC20