Ad Hoc Testing Templates

Shared Components: Ad Hoc Design Testing Template

Ahead of each session, fill out the following info:

  1. Describe the design need(s) to be addressed:

  2. Describe the available options (no more than three):

  3. Give an example of how the design would be used:

  4. Describe how each design option empowers users:

Kick off the Session: Thank testers for joining. Introduce folks if they haven’t met before.

  • Explain what the tool is (or what tools are).

VEGSPEC: “Thanks for joining! We brought you here to help us test a tool called Vegspec. It’s for helping NRCS agents reach their conservation goals. Using this tool, agents can search a database of plants based on the needs of their specific site, their selected practice, and the planting purpose. In addition, it lets agents generate seed mixes and estimated costs for their projects.”

COVER CROP SELECTOR: “Thanks for joining! We brought you here to help us test a tool called the Cover Crop Species Selector. It’s for helping people figure out what cover crop would work best for their location and goals.”

SEEDING RATE CALC: “Thanks for joining! We brought you here to help us test a tool called the Cover Crop Seeding Rate Calculator. It’s for helping people figure out the best seeding rate based on their location and needs.”

  • If helpful, give an example.

SPECIES SELECTOR: “For example, a farmer looking to plant something in August to prevent erosion and break compaction might find Winter Barley or Winter Cereal Rye could work for them given their particular needs.”

Explain the purpose of testing. “We’re developing a number of decision support tools, and we’re looking for input on our design to make sure that the tools feel as clear and as easy to use as possible. We’re looking for honest feedback to help us improve the tools.”

Check that the tester(s) are comfortable moving forward by getting their consent to join the discussion and take the survey. Explain the following areas (as relevant) and ask whether or not they agree to participate.

  • How data gathered from the session can and will be used

  • What sensitive information will be collected

  • That participation is voluntary -- a chance to opt out or consent

“If you’re comfortable sharing, what is your first and last name and your professional role?”

“I want to communicate that your participation is voluntary. Do you agree to participate?”

Design Demo: Sit down with the participant and demo the design from start to finish. Explain the purpose and the function it serves with an example. Allow participants to interject and ask questions throughout the demo. Pause frequently to prompt for questions and make sure to allow time for this interaction.

When you’ve presented users with a key decision or aspect for feedback, ask the tester to answer the following questions and write down their responses.

  • If there were multiple design options: Which do you prefer and why?

  • For the favored design: What is your initial impression of the design? What stands out? OR What do you like? What do you dislike?

  • How do you feel about the visuals? Do the look and feel work for you? Why or why not?

  • If the content/information on screen is accurate: Does the design communicate everything you feel you’ll need to [complete task]?

  • How easy do you think it would be to use this design to [complete task]? Are there any elements that you think might be difficult to navigate or interact with?

  • Is there anything about this design that might be difficult or confusing for some folks? In what ways?

  • What, if anything, seems like it’s missing? Is there anything you expected to be on the page that isn’t?

  • Is there anything else you would like us to know?

Keep choices clear and simple. Users should only be weighing in on one choice at a time. Once users have weighed in on their choice, move on to demoing the next design choice until all key choices have been made.

Complete a survey as testers respond with their feedback.

Thank the testers for their time and help!

Ad Hoc Feature Testing

Ahead of each session, fill out the following info:

  1. Describe the feature to be tested:

  2. Describe the purpose of the feature:

  3. Give an example of how the feature would be used:

  4. Describe what the feature empowers users to complete:

Kick off the Session: Thank testers for joining. Introduce folks if they haven’t met before.

  • Explain what the tool is. “Thanks for joining! We brought you here to help us test a tool called Vegspec. It’s for helping NRCS agents reach their conservation goals. Using this tool, agents can search a database of plants based on the needs of their specific site, their selected practice, and planting purpose. In addition, it lets agents generate seed mixes and estimated costs for their projects.”

  • If helpful, give an example. [Add in example here]

  • Explain the purpose of testing. “We’ll test with a wider group in the coming weeks. Your feedback now can help us identify key issues to address before we share it more broadly. The goal is to get your honest impression and input on the tool.”

Check that the tester(s) are comfortable moving forward by getting their consent to join the discussion and take the survey. Explain the following areas (as relevant) and ask whether or not they agree to participate.

  • How data gathered from the session can and will be used

  • What sensitive information will be collected

  • That participation is voluntary -- a chance to opt out or consent

    “If you’re comfortable sharing, what is your first and last name and your professional role?”

    “I want to communicate that your participation is voluntary. Do you agree to participate?”

Feature Demo: Sit down with the participant and demo the feature from start to finish. Explain the purpose and the function it serves with an example. Allow participants to interject and ask questions throughout the demo. Pause frequently to prompt for questions and make sure to allow time for this interaction.

Ask the tester to answer the following questions and write down their responses.

  • Can you walk me through your initial impression of [feature]? What do you like? What do you dislike?

  • Imagine you're trying to [complete task]. Can you walk me through how you'd use this feature to accomplish that?

  • How easy or difficult does it seem to use this feature? What could improve it?

  • Is there anything about this feature that you find confusing or unclear?

  • Does the feature, screen, or page communicate the information you need to [complete task]?

  • What, if anything, is missing? Is there anything you were expecting here that you don’t see?

  • Is there anything else you would like us to know?

Complete a survey as testers respond to log their feedback.

Thank the testers for their time and help!