With user testing, we want to be able to answer the question: Does [TOOL] provide [VALUE] to [AUDIENCE]?
For this project, we define those variables as:
[TOOL]: Seeding Rate Calculator, Mix Maker
[VALUE]: The tool is empowering, easy to use, and provides accurate information.
[AUDIENCE]: The groups listed below, collectively referred to as “users”:
Farmers
Extension Agents
NRCS Staff
Crop Consultants / Conservation District Agents
GOALS:
OUR GOAL is that through testing, we will be able to assess if the Seeding Rate Calculator empowers the user to calculate seeding mixtures and generate seeding rates. We can apply the following user stories to gather data on how well the tool achieves this objective:
...
Users feel empowered in their ability to make a seed mixture.
...
Users feel empowered in their ability to calculate a seeding rate based on the mixture they are interested in.
...
Users can easily modify an existing seed mixture to better meet their goals.
...
Users’ calculations equip them to take next steps toward their goals.
...
Extension Agents or NRCS staff feel empowered to evaluate mixtures created by someone else (e.g. a seed company, farmer, consultant, etc.)
...
Introductions / Demographics
What region are you located in?
What is your job?
What is your level of experience using digital applications (e.g. mobile apps, web apps, desktop apps)? [low, mid, high]
What is your level of knowledge with cover crop seed mixtures? [low, mid, high]
What is your level of knowledge with calculating seed mixtures? [low, mid, high]
Empowerment Questions
Do you feel like the tool gave you what you needed to achieve your goal? [Yes or No]
For the above, what did the tool provide that helped you do this, or what did it not provide that would have helped? [free response]
Do you feel empowered after using the tool to make a seed mixture _____? [Scale of 5]
[Extension Agents or NRCS staff] Do you feel equipped to support growers in exploring and confirming seeding mix options? [Scale of 5]
[Extension Agents or NRCS staff] Would this make you more or less likely to recommend cover crop mixtures?
The tool increases motivation for farmers to use cover crop mixtures.
The above is true no matter which region or what conditions exist where someone is located.
OUR GOAL is that through testing, we will be able to assess if the Seeding Rate Calculator feels easy to use for the target audience as they calculate seeding mixtures and seeding rates. We can apply the following user stories to gather data on how well the tool achieves this objective:
Users understand what to do to make their way through the tool.
Users understand the results they get from the tool.
It is easy for Extension Agents or NRCS staff to evaluate a mixture.
Ease of Use Questions
Did you understand how to make your way through the tool?
How easy or hard was it to understand the results you got from the tool? [Scale of 5]
What about it was easy or hard?
Could you easily modify an existing seed mixture to meet your goals? [Scale of 5]
Were you able to determine if a mix created by someone else (seed company, farmer, consultant, etc.) would meet your goals?
What, if any, other tools do you currently use for calculating seeding rates?
How easy did it feel to use this tool when compared to existing seeding rate and mix tools?
The above is true no matter what level of technology experience someone has.
OUR GOAL is that through testing, we will be able to assess if the Seeding Rate Calculator provides accurate information to support the work of the target audience. We can apply the following user stories to gather data on how well the tool achieves this objective:
The tool provides information that is accurate to a user’s location.
Farmers feel the tool provides a seeding rate that fits their needs and resources.
Accurate Information Questions
Did the tool provide information that was accurate to your location? [Yes or No]
Why or why not? [Free Response]
Did the tool provide a seeding rate that fits your needs and resources?
Is it higher or lower than what you typically use?
Farmers feel the tool provides seeding mix options that fit their needs.
Are there species missing? Does each species have enough info?
Extension Agents or NRCS staff feel the tool provides a seeding rate that is a good fit for the needs and resources of the growers they support.
Extension Agents or NRCS staff feel the tool provides seeding mix options that are a good fit for the growers they support.
What rates would you typically recommend or use for a mix like this?
Are there species missing? Does each species have enough info?
Users can check the accuracy of a recommended mix or seeding rate.
Users trust the information they get from this tool.
Testing Approach:
How would the testing be conducted in order to achieve our goals?
Async Testing Approach
Respondents are provided with a Testing Materials packet.
The packet will include information on where and how to access the tool.
The packet will contain the post-testing survey that respondents will complete.
The packet will provide instructions on how respondents can report a bug / what to do if they encounter a bug during testing.
The packet will include contact points for who to reach out to with questions or if a respondent is having issues.
Respondents try out the tool based on guidance in the materials including example scenarios to explore.
Respondents complete the post-testing survey.
The survey will ask what region the respondent is from, their level of experience with seed mixtures, and their level of comfort with technology.
The survey will only include questions relevant to that respondent (e.g. agents won’t be asked questions about being a farmer).
The survey will ask respondents to score relevant user stories listed above on their level of success. Some of the above user stories will be broken into multiple questions to ensure clarity and relevance of answers (for example, the survey should include questions that allow respondents to rate the ease of use for many different aspects of the tool).
The survey will include several AB (or ABC) options for look and feel / user-experience adjustments to the tool. Respondents will be asked to choose which option they prefer and provide additional information / context on why that was their choice.
The survey will ask if the respondent is available for a short interview on their experience if the dev team determines such a conversation would be helpful.
Optional: The Development Team may reach back out to a respondent who has completed the survey and opted in to interviews in order to discuss their experience and ask clarifying questions.
Testing feedback is collected, scored, and organized by the Testing Managers (GameTheory).
Testing Managers (GameTheory) provide a report that scores each user story’s level of success and notes the factors that led to that score based on the testing feedback (a rough scoring sketch follows below).
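As an illustration of how that scoring could be aggregated, here is a minimal sketch assuming each survey question maps to one of the user stories above and uses a 1–5 scale; the field names and example data are hypothetical and not part of the plan itself.

```python
# Hypothetical sketch: aggregate survey responses into per-user-story scores.
# Assumes each response row records the user story it maps to and a 1-5 rating.
from collections import defaultdict
from statistics import mean

# Example responses: (user story, rating on a 1-5 scale) -- made-up data
responses = [
    ("Users feel empowered to make a seed mixture", 4),
    ("Users feel empowered to make a seed mixture", 5),
    ("Users understand the results they get from the tool", 3),
]

def score_user_stories(rows):
    """Group ratings by user story and report the average rating and response count."""
    by_story = defaultdict(list)
    for story, rating in rows:
        by_story[story].append(rating)
    return {story: {"avg": round(mean(r), 2), "n": len(r)} for story, r in by_story.items()}

print(score_user_stories(responses))
```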
Async Testing Participant Needs:
A minimum of 5 respondents from each relevant geographical area.
A minimum of 10 farmer respondents and 10 agent / consultant respondents.
At least 5 respondents with each low, mid, and high levels of technical experience.
At least 5 respondents with each low, mid, and high levels of seed mixture experience.
Note: an individual may fit multiple categories (e.g. an extension agent from the Midwest region who has high technical experience and mid experience with seed mixes can count toward each of these categories); a roster-check sketch follows below.
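Because one respondent can count toward several categories, a quick roster check can confirm the minimums above are met. The sketch below is a hypothetical illustration with made-up field names and a placeholder region list; it counts each respondent toward every category they fit and flags any unmet minimum.

```python
# Hypothetical roster check for the async-testing participant minimums.
from collections import Counter

RELEVANT_REGIONS = ["Midwest", "Southeast", "Northeast"]  # placeholder region list

# Example roster rows (field names are made up for illustration)
roster = [
    {"region": "Midwest", "role": "extension agent", "tech": "high", "seed_mix": "mid"},
    {"region": "Southeast", "role": "farmer", "tech": "low", "seed_mix": "high"},
]

def unmet_quotas(rows):
    """Return a list of quota categories that are still below their minimum."""
    regions = Counter(r["region"] for r in rows)
    farmers = sum(r["role"] == "farmer" for r in rows)
    agents = sum(r["role"] != "farmer" for r in rows)  # agents / consultants
    tech = Counter(r["tech"] for r in rows)
    seed = Counter(r["seed_mix"] for r in rows)

    problems = [f"{reg}: {regions[reg]}/5" for reg in RELEVANT_REGIONS if regions[reg] < 5]
    if farmers < 10:
        problems.append(f"farmers: {farmers}/10")
    if agents < 10:
        problems.append(f"agents/consultants: {agents}/10")
    for level in ("low", "mid", "high"):
        if tech[level] < 5:
            problems.append(f"tech experience {level}: {tech[level]}/5")
        if seed[level] < 5:
            problems.append(f"seed-mix experience {level}: {seed[level]}/5")
    return problems

print(unmet_quotas(roster))
```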
1-on-1 Testing
Facilitators are provided with a packet with the information and materials they need to run a testing session.
The packet will include information on where and how to access the tool.
The packet will include AB test options and visual references.
The packet will contain the post-testing survey that respondents can optionally complete.
The packet will contain a form / database for recording feedback and observations.
Facilitators schedule an hour-long meeting with the respondent.
(15 mins) The facilitator will meet with the respondent, and let the respondent try out the tool. Facilitators should encourage respondents to explore on their own and try out options without extra help. Observe and note where they’re struggling or what comes easily to them.
(15 mins) The facilitator will ask the respondent a series of questions to assess whether the user stories are true or not for the respondent. The facilitator will record the respondent’s answers.
(30 minutes) Show the respondent two to three approaches to a particular aspect of the tool, such as user experience or look and feel. Ask them questions to prompt them to compare and contrast the options, and select their favorite using the prepared questions that are packaged with the visual references. Facilitators will record the feedback.
Optional: Respondents fill out the survey [Async Survey].
Testing feedback is collected, scored, and organized. Testing Managers (GameTheory) will provide a report that scores each user-story’s success and notes factors that led to that score, as derived from the notes and observations. GameTheory may follow up to clarify feedback or to debrief with the Facilitator on their findings.
1-on-1 Testing Participant Needs:
2-3 respondents from each relevant geographical area.
1-2 farmer respondents and 1-2 agent / consultant respondents.
1-2 respondents with each low, mid, and high levels of technical experience.
Note: an individual may fit multiple categories (e.g. an extension agent from the Midwest region who has high technical experience and mid experience with seed mixes can be one of the respondents in each of these categories).
Potential Supporting Questions for Testing
...
information for you?
How did you feel about how the results were presented? [Free Response]
How did you feel about the amount of information that was presented in the Results section? [Scale of 5: too little to too much]
Was anything especially confusing or helpful? [Free Response]
Did you get the information you were looking for? [Scale of 5]
How would you rate the trust you have for the information the tool provides? [Scale of 5]
Supporting Questions
What did you like about the tool? What didn’t you like?
Did using the tool increase your knowledge of the function, benefits, and use of cover crop mixtures?
...
Does the tool help users see an ROI by getting creative with how they’re planting, what’s in the mixes, and when they’re planting?
What types of characteristics will users be looking for in plant materials (for example, soil erosion and N fixation)?
How do users currently get their seeding mixes? For example, do you take soil tests in the planning phase?
...
How easy is it to adjust seeding rates?
...
Is it easy to understand the seeding rates based on how they look in the tool?
...
How else can the tool give users an accurate expectation of what a seed mixture will look like in the field?
...
How would you describe the quality of information the tool provides?
Low Quality | Moderate Quality | High Quality | Exceptional Quality
How would you describe the overall value of the information the tool provides?
Low Value | Moderate Value | High Value | Exceptional Value
Could you see yourself using this tool?
Yes | No | Other
Async Question Bank
Users can check the accuracy of a recommended mix or seeding rate. → (Async) Based on what the seeding rate calculator showed you, was the premade mix accurate?