Aysha Bilal is a UX Researcher and sugary-coffee enthusiast living in the San Francisco Bay Area.
[Image: card sorting]

Card Sorting Method

A case study of how we used this method to validate which content should surface on a specific dashboard in the product

[Disclaimer: per a confidentiality agreement I signed at Google, I am unable to share any specifics about the product, data or findings, so this case study will only show my research process.]

 

The Problem

This Google product team wanted to validate which content should surface on a particular dashboard versus another part of the product.

This content would provide a useful overview for advertisers.

Through this study, we wanted to qualitatively evaluate the hierarchy and usefulness of that content, as well as gather information that is relevant to advertisers about a particular part of that user flow.

 
 

The Research Process

[Image: research process diagram]

This is the research process that I follow for executing my studies. Through this case study, I will walk you through each of these phases in detail.

[None of the specific study findings will be shared.]

Overall, I unpacked a research question the team needed to address, then scoped, planned, and executed the study, as well as analyzed and presented the findings. I will conclude with impact.

Team: 2 UX Designers, 1 Product Manager, 3 Engineers, and 2 Instructional Designers

Tools: Google Workspace, Figma, Optimal Workshop, and Mural

 

Identify Problem

Collaborated with the stakeholder team to understand research objectives

What I did:

  • Read through background information

  • Facilitated discussions with product and design teams to align on objectives and learn about specific content

Main questions for stakeholders:

  • How might this content improve the user’s experience?

  • Clarifications on the design brief or content

 

Scope

The scope of the study was discussed with the project stakeholders and the Rapid Research team Program Manager to fit within the rapid, 3-week timeline.

What I did:

  • Facilitated discussions with product and design teams to align on key areas of interest, as well as prioritize and consolidate research questions

Main questions for stakeholders:

  • What decisions will this study help with?

  • What are we trying to validate?

  • How are we measuring user success?

 

Plan

60-minute, remote sessions utilizing the closed card sorting method and traditional interview questions to evaluate the hierarchy and usefulness of the content

N = 12 advertisers

A mix of small, medium and large businesses, as well as various product experience levels.

The content on the 21 cards included images of the tiles that would appear on the dashboard.

Closed card sort predefined categories were: critical / most useful, very useful, somewhat useful, not useful, and not clear / confusing.

Logistics:

  • Created study materials: study plan, study screener, study script, card sorting exercise on Optimal Workshop, stakeholder observations note taking form, and data analysis sheet

  • Collaborated with cross-functional teams for participant recruitment

    • Prioritized screening criteria with project stakeholders

    • Worked with recruitment and engineering teams to pull lead lists

 

Execute: Session Flow

[Image: session flow diagram]
 

Analyze

[Image]

Popular Placement Matrix

[Image]

Results Matrix

[Image]

Throughout the sessions, I debriefed the stakeholders daily and coded preliminary findings.

During this check-in process, we were able to identify iterations to the script and cards for future sessions.

After the sessions were complete, I coded and synthesized all of the findings.

 

The tool we used for card sorting, Optimal Workshop, helped to streamline the analysis.

These matrices help to answer some of the top research questions (e.g., how do participants rate the content’s usefulness?).
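To illustrate the idea (this is not the study’s data or Optimal Workshop’s internals), a closed-sort results matrix is essentially a tally of how many participants placed each card into each predefined category, and the popular-placement view picks the category chosen most often per card. A minimal sketch in Python, with hypothetical card names and placements:

```python
from collections import Counter

# Predefined categories from the study's closed card sort
CATEGORIES = [
    "critical / most useful", "very useful", "somewhat useful",
    "not useful", "not clear / confusing",
]

def results_matrix(sorts):
    """Tally, per card, how many participants placed it in each category.

    `sorts` is a list of dicts (one per participant) mapping
    card name -> chosen category.
    """
    matrix = {}
    for participant in sorts:
        for card, category in participant.items():
            matrix.setdefault(card, Counter())[category] += 1
    return matrix

def most_popular_placement(matrix):
    """For each card, the category chosen by the most participants."""
    return {card: counts.most_common(1)[0][0]
            for card, counts in matrix.items()}

# Hypothetical placements from three participants (card names invented)
sorts = [
    {"Card A": "very useful", "Card B": "not useful"},
    {"Card A": "very useful", "Card B": "not clear / confusing"},
    {"Card A": "critical / most useful", "Card B": "not useful"},
]
matrix = results_matrix(sorts)
popular = most_popular_placement(matrix)
```

With this toy data, "Card A" lands in "very useful" (2 of 3 participants), giving the kind of quantitative backing for qualitative findings described below.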

 

Affinity mapped participant feedback

This helped determine themes around what participants found confusing, what they thought was missing, and any recommendations they had for making the cards more useful to them.

 

Impact

Delivered valuable, user-centered insights to product stakeholders and Rapid Research team leads

“Thanks for driving this study! Really interesting findings that will be useful well beyond Q1!”

  • Product Manager for the product


“Thanks, Aysha! Really great to see the team taking a user-centered approach to define the [product].”

  • UX Design Manager on the Analytics, Insights and Measurement Team

 

Final Thoughts

Lessons Learned:

  1. Analysis tools can provide some quantitative data to help back up the qualitative findings.

  2. Card sorting exercises require some trial and error, and it is important to schedule some time with the project team to refine the process for participants.

  3. Many participants found one card specifically confusing. I wonder if iterating on this card between sessions would have yielded different data.

 

*Image credits: Some images were discovered through Google image searches: the first image of card sorting was found on uxplanet.org, and the Google Logo was found through a logo sharing site. The rest of the images were taken by me.