Validate

Test a design hypothesis.

Card sorting

What

A categorization exercise in which participants divide concepts into different groups based on their understanding of those concepts.

Why

To gain insights from users about how to organize content in an intuitive way.

Time required

15–30 minutes per user

How to do it

There are two types of card sorting: open and closed. Most card sorts are performed with one user at a time, but you can also do the exercise with groups of two to three people.

Open card sort

  1. Give users a collection of content represented on cards.
  2. Ask users to separate the cards into whatever categories make sense to them.
  3. Ask users to label those categories.
  4. Ask users to tell you why they grouped the cards and labeled the categories as they did.

Closed card sort

  1. Give users a collection of content represented on cards.
  2. Ask users to separate the cards into a list of categories you have predefined.
  3. Ask users to tell you why they assigned cards to the categories they did. (One way to analyze results from either sort is sketched after this list.)
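
However you run the sort, the raw output is each participant’s card-to-category assignments. One common way to analyze them (not prescribed here) is pairwise co-occurrence: how often two cards land in the same group across participants. Below is a minimal Python sketch; the cards, category labels, and participant data are all hypothetical.

```python
from itertools import combinations
from collections import Counter

# Hypothetical open-card-sort results: one dict per participant,
# mapping each participant-made category label to the cards placed in it.
sorts = [
    {"Getting started": ["Sign up", "Log in"], "Money": ["Fees", "Refunds"]},
    {"Account": ["Sign up", "Log in", "Refunds"], "Costs": ["Fees"]},
    {"Access": ["Sign up", "Log in"], "Billing": ["Fees", "Refunds"]},
]

# Count how often each pair of cards was grouped together.
pair_counts = Counter()
for participant in sorts:
    for cards in participant.values():
        for pair in combinations(sorted(cards), 2):
            pair_counts[pair] += 1

# Pairs grouped together by the most participants are the strongest
# candidates for living in the same content category.
for (a, b), n in pair_counts.most_common():
    print(f"{a} + {b}: grouped together by {n} of {len(sorts)} participants")
```

High-agreement pairs suggest content that belongs together in your information architecture; low-agreement cards are the ones worth probing when you ask participants to explain their groupings.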

Applied in government research

No PRA implications. The PRA explicitly exempts direct observation and non-standardized conversation, 5 CFR 1320.3(h)(3). It also explicitly excludes tests of knowledge or aptitude, 5 CFR 1320.3(h)(7), which is essentially what a card sort tests (though in our case, a poor result is our fault).

Multivariate testing

What

A test of variations to multiple sections or features of a page to see which combination of variants has the greatest effect. Different from an A/B test, which tests variations to just one section or feature.

Why

To incorporate different contexts, channels, or user types into addressing a user need. Situating a call to action, content section, or feature set differently can help you build a more effective whole solution from a set of partial solutions.

Time required

2–5 days of effort, 1–4 weeks elapsed through the testing period

How to do it

  1. Identify the call to action, content section, or feature that needs to be improved to increase conversion rates or user engagement.
  2. Develop a list of possible issues that may be hurting conversion rates or engagement. Specify in advance what you are optimizing for (possibly through metrics definition).
  3. Design several solutions that aim to address the issues listed. Each solution should attempt to address every issue by using a unique combination of variants so each solution can be compared fairly.
  4. Use a web analytics tool that supports multivariate testing, such as Google Website Optimizer or Visual Website Optimizer, to set up the testing environment. Conduct the test for long enough to produce statistically significant results.
  5. Analyze the testing results to determine which solution produced the best conversion or engagement rates (a minimal significance check is sketched after this list). Review the other solutions as well to see if there is information worth examining in future studies.
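
To make steps 3 through 5 concrete, here is a minimal Python sketch that enumerates variant combinations and runs a chi-square test of independence on conversion counts. The variant names and numbers are invented, and the sketch assumes SciPy is installed; in practice, the analytics tools named above handle the significance testing for you.

```python
from itertools import product
from scipy.stats import chi2_contingency  # assumes SciPy is installed

# Step 3: each solution is a unique combination of variants.
headlines = ["headline A", "headline B"]
cta_positions = ["CTA top", "CTA bottom"]
combos = [f"{h} + {c}" for h, c in product(headlines, cta_positions)]

# Hypothetical outcomes per combination: (conversions, non-conversions).
observed = {
    "headline A + CTA top": (120, 880),
    "headline A + CTA bottom": (95, 905),
    "headline B + CTA top": (150, 850),
    "headline B + CTA bottom": (110, 890),
}

# Steps 4-5: test whether conversion rate depends on the combination.
table = [list(observed[name]) for name in combos]
chi2, p_value, _, _ = chi2_contingency(table)

for name in combos:
    conversions, non_conversions = observed[name]
    rate = conversions / (conversions + non_conversions)
    print(f"{name}: {rate:.1%} conversion")
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one combination converts at a different rate.")
```

A low p-value only tells you that some combination performs differently; your tool’s per-variant reports (or pairwise follow-up tests) tell you which one.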

Applied in government research

No PRA implications. No one asks the users questions, so the PRA does not apply. See the methods for Recruiting and Privacy for more tips on taking input from the public.

Usability testing

What

Observing users as they attempt to use a product or service while thinking out loud.

Why

To better understand how intuitive the team’s design is, and how adaptable it is to meeting user needs.

Time required

30 minutes to 1 hour per test

How to do it

  1. Pick what you’ll test. Choose something that might help users accomplish their goals, such as a sketch, prototype, or even a “competitor’s product.”
  2. Plan the test. Schedule a research-planning meeting and invite anyone who has an interest in what you’d like to test (using your discretion, of course). Align the group on the scenarios the test will center around, which users should participate (and how you’ll recruit them), and which members of your team will moderate and observe. Prepare a usability test script.
  3. Recruit users and obtain their informed consent. Provide a way for potential participants to sign up for the test. Pass along to participants an agreement explaining what participation will entail. Clarify any logistical expectations, such as screen sharing, and send links or files of whatever it is you’re testing.
  4. Run the tests. Moderators should verbally confirm with the participant that it’s okay to record the test, ask participants to think out loud, and otherwise remain silent. Observers should contribute to a rolling issues log (one way to tally it is sketched after this list). Engage your team in a post-interview debrief after each test.
  5. Discuss the results. Schedule a 90-minute collaborative synthesis meeting to discuss issues you observed, and any questions these tests raise concerning user needs. Conclude the meeting by determining how the team will use what it learned in service of future design decisions.
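
A rolling issues log is easiest to synthesize when entries are structured. The Python sketch below tallies a hypothetical log to surface issues that multiple participants hit; the field names and entries are assumptions, not a format this method prescribes.

```python
# Hypothetical rolling issues log: one entry per issue an observer noted.
issues_log = [
    {"participant": "P1", "issue": "missed the search box"},
    {"participant": "P2", "issue": "missed the search box"},
    {"participant": "P2", "issue": "confused by form labels"},
    {"participant": "P3", "issue": "missed the search box"},
]

# Count distinct participants per issue so recurring problems surface first.
participants_hit = {}
for entry in issues_log:
    participants_hit.setdefault(entry["issue"], set()).add(entry["participant"])

total = len({entry["participant"] for entry in issues_log})
for issue, people in sorted(participants_hit.items(), key=lambda kv: -len(kv[1])):
    print(f"{issue}: seen with {len(people)} of {total} participants")
```

Issues that several participants hit are usually the first ones worth discussing in the synthesis meeting.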

Applied in government research

No PRA implications. First, any given usability test should involve nine or fewer users. Additionally, the PRA explicitly exempts direct observation and non-standardized conversation, 5 CFR 1320.3(h)(3). It also specifically excludes tests of knowledge or aptitude, 5 CFR 1320.3(h)(7), which is essentially what a usability test measures. See the methods for Recruiting and Privacy for more tips on taking input from the public.

Visual preference testing

What

A method that allows potential users to review and provide feedback on a solution’s visual direction.

Why

To align the established branding guidelines and attributes of a solution with the way end users view the overall brand and emotional feel.

Time required

4–12 hours for style tiles; 30 minutes per participant to get feedback.

How to do it

  1. Create iterations of a style tile that represent directions a final visual design may follow. If branding guidelines or attributes don’t exist, establish them with stakeholders beforehand.
  2. Interview participants about their reaction to the style tiles.
    • Ask questions as objectively as possible.
    • Align questions with the branding guidelines and attributes your project must incorporate.
    • As far as possible, allow participants to provide their feedback unmoderated or at the end of your research.
  3. Compare the results of your research with the agency’s published branding guidelines and attributes (one way to tally ratings against those attributes is sketched after this list).
  4. Publish the results to the complete product team and decide which direction will guide future design efforts.
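
One lightweight way to compare style tiles is to have participants rate each tile against the brand attributes, say on a 1–5 scale, and average across participants. The Python sketch below illustrates the tally; the tiles, attributes, scale, and scores are all hypothetical.

```python
from statistics import mean

# Hypothetical 1-5 ratings: participant -> style tile -> brand-attribute score.
ratings = {
    "P1": {"Tile A": {"trustworthy": 4, "modern": 3},
           "Tile B": {"trustworthy": 2, "modern": 5}},
    "P2": {"Tile A": {"trustworthy": 5, "modern": 2},
           "Tile B": {"trustworthy": 3, "modern": 4}},
    "P3": {"Tile A": {"trustworthy": 4, "modern": 3},
           "Tile B": {"trustworthy": 2, "modern": 5}},
}

# Average each tile's score per attribute across participants.
tiles = sorted({tile for p in ratings.values() for tile in p})
attributes = sorted({a for p in ratings.values()
                     for scores in p.values() for a in scores})
for tile in tiles:
    for attribute in attributes:
        avg = mean(p[tile][attribute] for p in ratings.values())
        print(f"{tile} / {attribute}: {avg:.1f} out of 5")
```

Comparing the per-attribute averages against the agency’s branding guidelines (step 3) shows which direction best carries the intended attributes.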

Applied in government research

No PRA implications. The PRA explicitly exempts direct observation and non-standardized conversation, 5 CFR 1320.3(h)3. See the methods for Recruiting and Privacy for more tips on taking input from the public.
