WCCC Species Selector: Testing Plan Draft
This test is designed to assess how farmers and agricultural professionals respond to the Western Cover Crop Council Species Selector Tool in its current form. It will gather ideas, improvement requests, and suggestions to inform planning for future iterations of the tool. The purpose is to seek input for refining the tool, rather than for redesigning it or making major changes or additions.
The test will focus on assessing the effectiveness of the tool and soliciting feedback for small- to medium-scale improvements during future development of the application. All tests should be concluded by __________.
Testing Aims
Assess the tool’s ability to meet the needs of diverse western users, including but not limited to differences in region, scale, and experience level.
Collect feedback on the overall effectiveness and helpfulness of the tool (e.g. is the information it provides actionable and helpful).
Check overall usability needs and concerns to ensure that the tool is easy to use (e.g. is the information provided clear and easy to reach) on whatever platforms best support the respondents’ needs and use cases.
Identify updates that would be needed to ensure the tool can support the diverse range of use cases for the West.
Understand key utilization differences and needs to support the various sub-regions, climates, and goals within the West.
Key Tester Demographics:
Primarily, we would be seeking to get respondent feedback from the following regions:
Mountain Range
Pacific Islands
Pacific Northwest
Southwest
Within those regions, we would also seek diversity of respondent experience across these need areas:
Irrigated vs. Non-Irrigated (e.g. 0”–20”, 20”–60”, and 60”+)
Perennial vs. Annual
Within Perennial: Grazing vs. Perennial Horticulture
Scale / Acreage
Expertise / Experience Level
Additionally, while we will not recruit respondents based on these demographic measures, we will ask respondents to identify:
Level of Comfort with Technology
Platform of Choice (e.g. do they want to do this work on phones, tablets, or desktops)
Goals (what are they coming to the tool seeking to achieve with cover cropping practices)
Testing Roles
Facilitation Coordinator: Edit materials, support materials finalization, run mock testing sessions as needed, assist with tutorial creation, distribute testing materials, communicate with facilitators, remind them of upcoming deadlines, and answer questions.
@Elizabeth Seyler
Testing Lead: Draft materials, design the overall testing approach and strategy, run mock testing sessions as needed, and prepare the training approach.
@Shannon Mitchell @Marguerite Dibble
Facilitator: A trained individual who showcases the tool for respondents, asks them questions, and records their input at a testing session.
Total Facilitators: ________
Co-facilitator (optional): This individual joins a testing session with a facilitator to support the facilitator by taking notes during the session to allow the facilitator to focus on the group conversation.
Respondent: An individual who represents a user, speaking to user perspectives and needs and to how the tool would impact users. In testing sessions, respondents share their feedback with facilitators.
Total Respondents: _______
Total Respondents per eco-region:
Timeline → WCCC User Testing
Outline of Materials & Testing Approach
Target Participation:
Each facilitator should test with X# of respondents:
Feedback is due by _____________
Disclaimer on Usage of Data: Answers provided via discussion and a completed survey, including your name, state, job responsibility, and perspective on the tool, will be used to inform tool improvements. These responses will be shared with the development team at Precision Sustainable Agriculture. Responses will not be anonymous, so that developers may contact respondents for clarification if needed.
Materials: For Facilitators
Overview Doc that includes:
Timeline and targets
Session format
Links to “Facilitator Form” and “Respondent Survey”
Suggested tester recruitment email
Facilitator Form: A document where facilitators log info about their testing sessions.
Materials: For Respondents
Respondent Survey: A document for recording responses as part of / at the end of the session.
Session Format (one-on-one, 75 minutes long):
If meeting in person, facilitators should request that respondents use the device they would most likely use to access this information as part of their standard workday. It is strongly recommended that sessions be one-on-one, not in groups.
Note: If a facilitator and respondent are meeting remotely, the respondent should use a desktop computer, because screen sharing is part of the testing process and is difficult on a mobile device or tablet.
Introductions & Overview (5 mins): Introduce yourself, ask respondents to introduce themselves (name, location, job title), and record their name(s) in the form. Give an overview of the session, explain why feedback is valuable, and explain how we’ll use the data. Ask for consent to continue.
Respondents Open the Tool (3 mins): Share the link with participants, ask them to open it, and ask them to share their screen so you may see what they’re doing if you’re testing remotely, or ensure you can see what they’re doing if testing in person.
Questions About Pages (55 minutes): Following the script in the Testing Facilitator Form, have respondents use the tool page by page and ask them the questions on the form. Also observe respondents’ reactions and record them on the form.
Discuss General Takeaways (5 minutes): Stop screen sharing or put away the device you were testing with and ask respondents the final questions.
Complete Surveys (5 minutes): Before ending the session, ask respondents to complete the Respondent Survey. Meanwhile, review your notes for accuracy and complete the last section of the Testing Facilitator Form. Once everyone submits their response, thank the respondents for their time and close the session.
Recommended Setup:
Facilitators may test remotely over Zoom or in person.
Facilitators should have a laptop or tablet of their own to use for the Facilitator Form.
Many facilitators appreciate having a co-facilitator who can support with note-taking during the session.
Respondent Survey Questions
About You
What’s your name?
What’s your email address? (Optional; for developer use.)
How did you access the tool (e.g. phone, tablet, computer)?
What organization do you work for? What’s your job title?
How long have you been working in this or a related role?
What’s your state? Which zone are you in? [dropdown]
Need to confirm the measures in this category.
What best describes how you currently use the land? [dropdown]
Need to confirm the measures in this category.
How comfortable are you with technology? (5-point scale: “not at all comfortable” to “very comfortable”)
Overall
How intuitive is the current version of the tool upon first use?
It is very intuitive and much easier to use than my current tool.
It is somewhat intuitive and slightly easier to use than my current tool.
It offers essentially the same user experience as my current tool.
It is somewhat confusing and slightly harder to use than my current tool.
It is very confusing and much harder to use than my current tool.
Value
Scale: 1= Strongly Disagree, 5= Strongly Agree
This tool has functionality I value; I would make use of it in my day-to-day work.
What, if anything, would need to change for the tool to be useful in your work?
The tool asks for information in a way that is intuitive and logical.
I understood where the data was coming from in the tool.
I think the tool would help me make accurate and informed decisions.
Usability
I found the tool easy to navigate; I could find what I needed without getting lost or confused.
Terminology was relevant, easy to understand, and clearly labeled.
I feel confident I could use this tool efficiently without encountering significant obstacles.
Facilitator Questions for Focus Group Discussion
Ask respondents the following and record their answers…
What are your impressions of the Home page?
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
What’s your impression of the Field Location page?
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
What’s your impression of the Site Conditions page?
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
What’s your impression of the Goals and Cash Crop Growing Window page?
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
Does the Crop Calendar view (on the next page) provide the information you need? Please elaborate.
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
Does the Crop List view (on the same page) provide the information you need? Please elaborate.
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
Does the My Selected Crops view provide the information you need? Please elaborate.
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
What other factors, if any, would you want to compare?
What’s your impression of the Cover Crop Information Sheet?
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
How best could these sheets support you in your work?
Does the Browse Cover Crops page provide the information you need? Please elaborate.
Is there anything you were expecting here that you don’t see?
Is there anything that seems confusing or irrelevant to you?
How do the factors in the Cover Crop Selector compare to the factors you currently use?
What’s your overall impression of the tool?
Is there anything else you’d like us to know?
Log your takeaways and observations…
How easily was the respondent able to make their way through the tool during testing?
What challenges, if any, did they have?
How did the session(s) go? Is there any other support you would like to have had? (identify challenges or barriers)