An evaluation plan should be an integral part of your overall written plan for a quality reporting project. To support the planning of an evaluation, this page covers clarifying your purpose and evaluation questions, defining criteria for success, deciding when to start, choosing measures and data collection methods, developing tools, analyzing the data, and reporting your findings.
To clarify the purpose of your evaluation, start by identifying what you need to learn in the short and long term. Think specifically about the decisions you and your partners are facing and when they have to be made. Key issues include:
Since your resources are sure to be limited, answering these questions will help to set priorities for learning.
Evaluations are most useful when they inform key decisions by answering the right question at the right time. What specific questions do you need to answer to adequately inform your decisions? Note that you may have several questions and that different questions may be appropriate to ask at different stages of your effort. For example, you will probably need answers to questions about your process sooner than you need answers to questions about results.
The number of questions you can address depends largely on the time and resources available. It also depends on whether you can save money by using the same data collection methods to gather the answers to more than one question at a time. For example, you might use a single community survey to address questions about whether your audience was aware of the report, sought it out, or used it. But this type of survey probably wouldn't work for determining whether people understood the report.
To properly evaluate your efforts, develop specific criteria for success. Here are some issues to consider:
The credibility of your evaluation with various stakeholders will depend in part on whether you define success in a way that resonates with them. They may have different points of view about the most important criteria for success. Make sure you get their input and come up with a clear set of criteria that reflect a shared vision. You might find that clarifying your criteria leads to useful, if sometimes thorny, discussions about exactly what you are trying to achieve, for whom, in your initiative.
Planning your evaluation as early as possible makes it easier to start your assessment when you want to. People who start late often find themselves playing "catch-up" and unable to get the information they need.
As early as possible, decide when you will start work on collecting feedback. If you are evaluating your processes, you need to move quickly to gather the data you need. If you are evaluating your results, you may also need to start early if you hope to collect data on the situation before your report is issued. This information is often called baseline data.
However, even if you are well along in your efforts, and have not been able to focus on evaluation yet, you can and should start as soon as possible. If you are in this for the long haul, you need to harness evaluation tools to help the project move forward in the right direction as you get more sophisticated and perhaps more ambitious.
How will you measure whether each of your criteria has been met? When you're thinking about what data to track, keep in mind that the things that are easiest to count are not necessarily the most informative. For instance, the number of reports mailed to enrollees doesn't tell you whether anyone read, understood, or used them.
When you develop your plan, answer these questions as well:
One important thing to consider is whether you are collecting data on individuals or groups/organizations:
When you collect data about groups or organizations, you typically collect it from individual people who are knowledgeable about the group or organization in question. These people are sometimes called "key informants."
How will you collect data on your measures? You are likely to use a mix of qualitative and quantitative methods in your evaluation, and perhaps to tap into existing data as well, especially if you are evaluating a Web-based report.
This page provides brief descriptions of several useful data collection methods for evaluating public reports. The method you use depends on the question you are asking as well as the time, resources, and talent that you have available. You must also consider what will be credible to the audience for your evaluation findings.
A key decision in any evaluation is what data collection method to use to answer your evaluation questions. Here are some examples of how to fit a data collection method to a question. You may need to use multiple methods to address all your important questions.
If you are evaluating process:

Did you engage with the right partners at the right time?
- Possible qualitative methods: Interviews with partners and staff
- Possible quantitative methods: Survey of partners

Did the measures you chose resonate with your audience?
- Possible qualitative methods: Focus groups with a sample of audience members
- Possible quantitative methods: Surveys of audience members

Did your audience find your Web-based report?
- Possible qualitative methods: Focus groups with a sample of audience members
- Possible quantitative methods: "Web analytics" to track usage of the Web site; surveys of audience members

Did you get the media attention you wanted?
- Possible quantitative methods: Tracking of media mentions (by yourself or through a service)

Did your outreach efforts reach those who are less literate?
- Possible qualitative methods: Interviews with outreach partners; focus groups with less literate community members
- Note: A survey would probably not be appropriate for a less literate group, unless it was done by telephone.
If you are evaluating results:

Did your audience understand the report?
- Possible qualitative methods: Usability testing after the report has been issued
- Possible quantitative methods: Survey of audience members, including questions to test their knowledge of key facts and messages in the report

Did your audience use the report? How?
- Possible qualitative methods: Focus groups with samples of audience members
- Possible quantitative methods: Survey of audience members; observed changes in enrollment in health plans or use of providers (very difficult)

How did providers and plans respond to the report?
- Possible qualitative methods: Interviews with individual plans and providers
- Possible quantitative methods: Survey of plans or providers

Did the reports intensify quality improvement activities?
- Possible qualitative methods: Interviews with individual plans and providers
- Possible quantitative methods: Survey of plans or providers

With what results?
- Possible quantitative methods: Changes in plan or provider performance over time on key metrics, including but not limited to those in your report
A survey asks a systematic sample of a population a set of questions that they answer using a specified set of responses. The sample population could be community members (including those you hope to reach), people who actually use reports, or representatives of purchasers, providers, plans, or policymakers.
Surveys ask a series of questions that can be closed-ended (where a limited set of answers is provided for each question) or open-ended. The use of closed-ended questions means that survey results are quantifiable.
Surveys may be administered by mail, by telephone, in person, or over the Web. Some Web sites incorporate a survey "feedback" function that asks questions and solicits comments from site visitors.
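Because closed-ended questions restrict answers to a fixed set of choices, their results tally directly into counts and percentages. As a minimal sketch in Python, using invented responses to a hypothetical awareness question:

```python
from collections import Counter

# Hypothetical closed-ended responses to one survey question:
# "Were you aware of the quality report?" (Yes / No / Not sure)
responses = ["Yes", "No", "Yes", "Not sure", "Yes", "No", "Yes"]

counts = Counter(responses)
total = len(responses)
for answer, n in counts.most_common():
    print(f"{answer}: {n} ({n / total:.0%})")
```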
What's needed for surveys?
In a focus group, a small group of individuals spends 1 to 2 hours in a guided discussion of a small set of questions. The individuals typically have certain characteristics in common, but they may also be diverse on other characteristics.
Unlike questions on surveys, the questions asked in focus groups can be answered in any way that the participants choose. No predetermined answers are provided.
The interaction among participants and how they influence each other are both part of the "data" that is of interest. In some focus groups, participants complete a brief survey at the beginning to capture their demographic characteristics or other information. In others, participants respond to a stimulus provided by the moderator.
What's needed for focus groups?
A key informant interview focuses on a single individual or a very small group of individuals who are chosen because they:
One or two interviewers ask the key informants a set of "open-ended" questions that permit respondents to answer in their own words. These interviews can be conducted in person or by telephone.
In some cases, interviews are highly structured: questions are asked in the same order, with the same wording, of everyone. Semi-structured interviews are more common; in such interviews, interviewers can reword the questions to fit the situation and change the order of questions. In all kinds of interviews, one can use "probes" (either specified ahead of time or identified during the interview) to delve deeper into a topic or issue.
What's needed for interviews?
With the growth of the Internet has come a parallel growth in methods for assessing how, and by whom, a given Web site is being used. Web analytics can also indicate whether the links or ads you have placed to let people know about your report are actually being used. Analytics services are typically offered by private companies, sometimes for a fee; some search companies, for example, offer free Web analytics tools.
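If you host the report site yourself, even raw server access logs can yield basic usage counts without a third-party service. The following Python sketch is a rough illustration only, assuming the common Apache/Nginx log format; the log file name and the "/quality-report" URL path are hypothetical:

```python
import re
from collections import Counter

# Count requests for report pages in a Web server access log.
# Assumes the common log format, e.g.:
# 203.0.113.5 - - [10/Oct/2023:13:55:36 -0700] "GET /quality-report/plans HTTP/1.1" 200 2326
LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP')

page_hits = Counter()
with open("access.log") as log:  # hypothetical file name
    for line in log:
        match = LOG_LINE.search(line)
        if match and match.group("path").startswith("/quality-report"):
            page_hits[match.group("path")] += 1

# Show the ten most-requested report pages
for path, hits in page_hits.most_common(10):
    print(f"{hits:6d}  {path}")
```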
Several kinds of tools are available for evaluating your project, including interview protocols, surveys, and focus group moderator guides. The tools you need and the activities you carry out depend on your data collection methods. When you are collecting primary data, you typically have to develop tools specifically for your situation.
When you develop your plan, answer these questions:
Analysis methods vary by how you collect the data. Quantitative data call for standard statistical analyses; be sure you have the expertise and the software required to conduct them.
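As a minimal illustration of what such an analysis can look like, the sketch below summarizes hypothetical 1-to-5 ratings with Python's standard statistics module; a real analysis would typically add weighting, subgroup comparisons, and significance tests:

```python
import statistics

# Hypothetical ratings (1 = very hard, 5 = very easy) from a survey
# question about how easy the report was to understand
ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

print(f"n       = {len(ratings)}")
print(f"mean    = {statistics.mean(ratings):.2f}")
print(f"median  = {statistics.median(ratings)}")
print(f"std dev = {statistics.stdev(ratings):.2f}")
```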
The analysis of qualitative data is less familiar to most people, but there are systematic and rigorous ways to analyze transcripts from interviews and focus groups. Qualitative analyses of the content of these transcripts are used to identify themes, patterns, and variations across different kinds of respondents.
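Systematic coding of transcripts is work for trained analysts, often aided by dedicated qualitative-analysis software. Still, a simple keyword tally can give a rough first pass at how often candidate themes surface. The sketch below is a toy illustration only; the transcript excerpts and theme keywords are invented:

```python
# Toy first-pass tally of candidate themes across interview transcripts.
# Not a substitute for systematic qualitative coding.
transcripts = [
    "I found the report confusing, too many numbers on one page.",
    "The star ratings were clear, but I wanted more detail on cost.",
    "Confusing layout; I gave up before finding my health plan.",
]

themes = {
    "confusion": ["confusing", "confused", "unclear"],
    "cost": ["cost", "price", "expensive"],
    "clarity": ["clear", "easy"],
}

for theme, keywords in themes.items():
    hits = sum(
        any(word in t.lower() for word in keywords) for t in transcripts
    )
    print(f"{theme}: mentioned in {hits} of {len(transcripts)} transcripts")
```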
When you develop your plan, answer these questions:
Over the years, evaluators have learned that how, when, and to whom they report their findings has a big influence on whether the results ever get used. Just as you need to be very aware of your audience in designing and distributing a quality report, you have to be clear about the audience(s) for your evaluation results.
When you develop your plan, answer these questions: