Assessment Resources

Asking good survey/interview questions


I. Reliable and Valid
Good questions are reliable and valid, which means they should produce consistent measurements in comparable situations, and the answers should correspond to what you intend to measure.

  1. Reliability: The question should be universally understood and mean the same thing to every respondent.
    • This may require defining terms/concepts clearly.
    • Ask only one question at a time; a question that asks two things at once is called a "double-barreled question" and should be avoided.
    • Design both closed-ended and open-ended questions to gain a complete view of respondents' thoughts and opinions.
  2. Validity: refers to how well the question measures what you intend to measure, that is, the relationship between the answers and the true value of what is being measured. Valid findings represent the phenomena you intend to measure.
II. Measure Facts
Questions should measure facts as well as subjective states such as attitudes, opinions, and feelings. Through a survey you are essentially trying to show a predictable relationship between the facts and the subjective states that are of interest.

  1. Vary the types of questions
    • Open-ended: allows the survey taker to express, in their own words, additional information you may not have anticipated. This is helpful for gaining insight into unexpected responses.
      1. How has your membership or worship attendance changed over the last five years?
    • Closed: typically provides a list of acceptable responses.
      1. Yes/No questions: "Are you currently employed?"
      2. Multiple choice (select one option): for example, "Are you currently employed full time, part time, or seasonally/temporarily?"
      3. Multiple choice (select all that apply): for example, "As a currently employed individual, which of these issues are of most concern to you?" Provide a list of issues for respondents to select from.
      4. Likert rating scale: indicates a person's level of agreement with a statement on a defined scale (a tallying sketch follows this list).
      5. Ranking: asks respondents to order the factors that influence their answers or decisions by importance.
    • Combination of both: a multiple-choice question where respondents select one answer, plus an "Other" category where survey takers can describe, in their own words, an answer that is not listed.
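
A closed-ended Likert item yields data that is easy to quantify. The following is a minimal Python sketch, not part of the Fowler text; the question wording, the five-point scale, and the twelve sample responses are all hypothetical, used only to show how answers on a defined scale tally into counts and a simple average.

  from collections import Counter

  # Hypothetical 5-point Likert item: "Our congregation is welcoming to newcomers."
  SCALE = {
      1: "Strongly disagree",
      2: "Disagree",
      3: "Neither agree nor disagree",
      4: "Agree",
      5: "Strongly agree",
  }

  # Hypothetical responses from twelve survey takers (each value is a point on the scale).
  responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 1, 4]

  # Tally how many respondents chose each point on the scale.
  counts = Counter(responses)
  for value in sorted(SCALE):
      print(f"{SCALE[value]:<28} {counts.get(value, 0)}")

  # One simple summary statistic: the average level of agreement.
  mean_agreement = sum(responses) / len(responses)
  print(f"Mean agreement: {mean_agreement:.2f} on a 1-5 scale")
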
Fowler, Floyd J., Jr. "Designing Questions to Be Good Measures." Survey Research Methods. 5th ed. Los Angeles: Sage Publications, 2014. 76+. Print.

Evaluation Development Worksheet
For more information on assessment, download the Assessment 101 PDF.


SURVEY BEST PRACTICES

  1. Plan out the survey process before designing the survey. Define the objective.
    1. What question are you trying to answer?
    2. Who are you sending the survey to?
    3. Sample size – how many responses do you need in order to gain an accurate estimate of the target group's attitudes and feelings? (A rough calculation is sketched after this list.)
    4. The timeline – leave the survey open long enough to gather responses from a variety of respondents.
  2. Beginning with your objective, decide what questions and/or information you need to compile to indicate you have reached your objective.
  3. Design/format the survey to match your brand. Use your logo to make it visually appealing.
  4. Ask one question at a time!
  5. Group your questions by topic/on separate pages.
  6. Remove bias!
    1. Attempt to develop objective questions that do not lean one way or the other. Avoid including your own opinions in the questions, which can bias the answers.
  7. Keep your survey short and to the point in order to get the best response rate.
  8. Formulate more closed-ended questions than open-ended ones. Closed-ended questions are easier for the respondent to answer quickly and provide you with easily quantified data. Open-ended questions require coding of responses and identification of trends, which can take longer to process.
  9. The first questions should engage the respondent and be easily answerable. Leave personal/demographic questions until the end. Individuals will be more likely to answer more sensitive questions after responding to the bulk of the survey.
  10. Provide an introduction to the survey. Tell the survey takers why you are asking them this set of questions. Give them an estimated completion time, and explain what will be done with the data once it's collected.
  11. Send your survey for a test run. Preview the survey!
    1. Make sure you get feedback from the test survey takers. You want to know that everyone understands each question in the same way and that all survey design features are working properly (e.g., skip logic, randomization, and question order).
  12. Share the results and findings with the survey takers so they are aware of what you've learned. This helps build trust by showing respondents that their input makes a difference.
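
For the sample-size question in step 1, a standard margin-of-error calculation gives a rough target. Below is a minimal Python sketch, not drawn from the sources cited here; it assumes the common defaults of 95% confidence (z = 1.96), a 5% margin of error, and maximum variability (p = 0.5), and the target-group size of 600 is hypothetical.

  import math

  def sample_size(population, z=1.96, margin_of_error=0.05, p=0.5):
      """Estimate how many responses are needed for a given margin of error.

      Uses the standard formula n0 = z^2 * p * (1 - p) / e^2, then applies a
      finite-population correction so small target groups are not oversampled.
      """
      n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
      n = n0 / (1 + (n0 - 1) / population)
      return math.ceil(n)

  # Hypothetical target group of 600 members: roughly 235 completed responses
  # are needed for a +/-5% margin of error at 95% confidence.
  print(sample_size(600))

Keep in mind this estimates completed responses; because not everyone responds, the survey typically needs to reach a larger group than the number returned here.
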

http://www.constantcontact.com/aka/docs/pdf/Top10SurveyBestPractices.pdf

Fowler, Floyd J., Jr. "Designing Questions to Be Good Measures." Survey Research Methods. 5th ed. Los Angeles: Sage Publications, 2014. 10. Print.

SURVEYMONKEY BEST PRACTICES

Best practices for every step of survey creation. Offers three sections with links pertaining to the overall topic: how to design a survey, how to get survey responses, and how to analyze survey results.

  1. Survey Design
  2. Survey Response
  3. Survey Analysis