As you start thinking about rolling out use cases in your organization, it can be tempting to pick one based solely on your assumptions. Instead, we recommend that you take the time to prepare and choose the use case to prioritize based on solid research. After all, well begun is half done! One great way to research your end users is to send them a user survey.
User surveys are an effective tool for broadly understanding the end users in your organization. They help you spot the areas that stand out, so you can dig deeper into them through follow-up 1:1 discussions with representative users.
Best practices
Define a clear goal for the survey
What does a clear goal look like? Let's use an example. Say you want to understand different end users in your organization. Instead of a goal like "I want to better understand end users in the organization", your goal should be something like "I want to better understand the data-related challenges of end users in the organization".
You can get more specific and ask:
- What are the broad categories of users in the organization and what are the key factors that lead to this segmentation?
- Is it the tools they use, the data-related activities they perform, the time they spend on those activities, or the teams they collaborate with?
Focus on using closed-ended questions
Closed-ended questions give respondents pre-populated answer choices to pick from, such as multiple choice or checkbox questions. They are a great way to ensure that respondents interpret questions the same way.
Open-ended questions, on the other hand, ask the respondent for feedback in their own words. They allow the greatest variety of responses, but they are time-consuming to answer and require a lot of work to analyze. Respondents are also more likely to skip open-ended questions, so try to include only 1-2 of them in your survey.
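To make that analysis trade-off concrete, here is a minimal sketch in Python (the question keys and responses below are made up for illustration): closed-ended answers can be tallied automatically in a line or two, while open-ended answers still need someone to read through and categorize them.

```python
# Hypothetical survey export: one dict per respondent.
# "q_find_data" is closed-ended, "q_biggest_pain" is open-ended.
from collections import Counter

responses = [
    {"q_find_data": "Easy", "q_biggest_pain": "Too many dashboards with no clear owner."},
    {"q_find_data": "Complex", "q_biggest_pain": "I never know which table is the source of truth."},
    {"q_find_data": "Complex", "q_biggest_pain": "Documentation is outdated."},
]

# Closed-ended: pre-defined choices tally into a distribution you can act on.
print(Counter(r["q_find_data"] for r in responses))
# e.g. Counter({'Complex': 2, 'Easy': 1})

# Open-ended: no shortcut; someone has to read and theme each answer.
for r in responses:
    print("-", r["q_biggest_pain"])
```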
Keep your answer choices balanced
Remember that since you are asking respondents to choose from a fixed set of options, it is crucial that those options do not bias their responses.
For example, let's say our prompt is, "How do you find the process of locating the right data?"
Here's what an unbalanced set of answer choices, one that leans too positive, might look like for that question:
- Very easy
- Easy
- Neither easy nor complex
And here's how they'd look once balanced:
- Very easy
- Easy
- Neither easy nor complex
- Complex
- Extremely complex
Don't ask double-barreled questions
A double-barreled question asks for feedback on two separate things within a single question.
Here's an example: "How do you find the process of locating the right data and understanding the context behind it?"
How would the respondent answer this question? Should they address the issue of locating the right data, or the issue of understanding its context? Questions like these confuse respondents and lower the quality of your survey data.
Don't let your survey get too long
You probably know from your own experience that people tend to abandon long surveys. The same principle applies when you are the one designing the survey, so keep the questionnaire short. Your reward for keeping it under 5 minutes to complete will be a higher organization-wide completion rate.
Another good way to increase completion rate is to schedule 10 minutes in your team meeting for everyone to fill out the survey while they are on the call. This has two benefits:
- Respondents get an opportunity to ask clarifying questions.
- You end up with enough responses for analysis.
To help you get started, we have created a user survey template. Feel free to customize it for your goals.