Assessment Resources

Asking good survey/interview questions


Good questions are reliable and valid: they should yield consistent answers in comparable situations, and those answers should reflect what you intend to measure.

  1. Reliability: The question should be universally understood and mean the same thing to every respondent.
    • This may require defining terms/concepts clearly.
    • Ask only one question at a time. A question that asks two things at once is called a “double-barreled question.”
    • Design both closed and open-ended questions to gain a complete view of the respondent’s thoughts and opinions.
  2. Validity: The question should measure what you intend to measure; that is, respondents’ answers should correspond to the true value of what is being assessed, so that your findings represent the phenomena of interest.


Questions should measure facts as well as subjective states such as attitudes, opinions, and feelings. Through a survey, you are essentially trying to show a predictable relationship between the facts and the subjective states that are of interest. You can do this by varying the types of questions:

  • Open-ended question: allows the survey taker to express additional information in their own words that you might not have anticipated. This is helpful for gaining insight into unexpected information. Example – “How has your membership or worship attendance changed over the last five years?”
  • Closed-ended question: typically provides a list of acceptable responses. Example – Yes/No questions: “Are you currently employed?”
  • Multiple choice – select one option – Example – “Are you currently employed full-time, part-time, or seasonally/temporarily?”
  • Multiple choice – select all that apply – Example – “As a currently employed individual, which of these issues are of most concern to you?” Provide a list of issues for respondents to select from.
  • Likert rating scale: indicates a person’s level of agreement with a statement based on a defined scale.
  • Ranking: asks respondents to order options by the importance they place on each.
  • A combination of open and closed questions: a multiple-choice question where respondents select one answer, with an “other” category where survey takers can describe, in their own words, an answer that is not listed.

Fowler, Floyd J., Jr. “Designing Questions to Be Good Measures.” Survey Research Methods. 5th ed. Los Angeles: Sage Publications, 2014. 76+. Print.


Evaluation Development Worksheet
Survey Best Practices
For more information on assessment download the Assessment 101 PDF


Church Assessment Resources

United Church of Christ’s Be the Church: A Mission Planning Guide for Congregations – Includes a Congregational Assessment Tool and Planning Worksheets: Free booklet for download.
Hartford Institute for Religion Research – Congregational Assessment Inventories: Online assessment for members costs $300 and includes a report of results.
Evangelical Lutheran Church in America (ELCA) – Congregational Vitality Assessment Tools and Resources: Free assessments and interpretive resources available for download.


Evaluation Design Resources

Designing “Good” Questions
Are you asking the right questions? These guides can help you avoid common pitfalls that result in bad data.
Creating Good Interview and Survey Questions
5 Common Survey Question Mistakes

Planning Outcomes-Based Evaluations
Not sure how or when to evaluate programmatic impact and outcomes? Start here.
Basic Guide to Outcomes-Based Evaluation for Nonprofit Organizations
Performance Measurement and Improvement
A Step by Step Guide to Monitoring and Evaluation
Project/Programme Monitoring and Evaluation (M&E) Guide
Handbook on Planning, Monitoring and Evaluating for Development Results

Focus Group/Interviewing Resources
When should you conduct a focus group? Step-by-step guides walk you through the process, from planning all the way through analysis.
Focus Group Interviews: Quick Tips
Data Collection Methods for Program Evaluation: Focus Groups
Using Focus Groups for Evaluation

Mission Planning Tools

Logic Model – Instructions and a blank template for download (Excel)


Evaluation Analysis Resources

How to Analyze Survey Results with Survey Monkey
Statistical Significance: Find Out if the Results You Received Are Statistically Significant
Using Compare Rules to Cross-Tabulate Results
Filtering by Open-Ended Comments or Text


Evaluation Report Resources

Components of the Evaluation Report