Collecting Evaluation Data: Knowing Enough About the Types to Decide What's Right for Your Program

Determining which method is best for collecting evaluation data can be involved. Should you use a survey? How about an interview? Would focus groups get at more of the truth about the program? What about on-site observations? Each method has advantages and disadvantages. Here are some points to remember when you are working with your evaluator to decide which techniques to use.

Surveys

Perhaps you have decided you want a survey included in your evaluation. When creating and administering a survey, a few points are worth keeping in mind to get the best results.

As program staff, you might create the survey yourself, have your evaluator create it, or create it in partnership. Keep in mind that you are likely to be most familiar with the content, so you may be asked to put together some ideas of what you would like included. An evaluator who is experienced in creating surveys can take it from there and make sure the survey follows sound design guidelines. Whether you create the full survey or your evaluator produces it, there are many issues to address, such as those discussed below.

Questions can be open-ended ("What do you like about school?") or closed-ended ("Which of the following do you like about school?" with a list of choices). Open-ended questions allow for a variety of answers, while closed-ended questions offer a fixed set of choices. Pick the type of question that best fits the kind of information you are looking for and the amount of time you have to spend reviewing the answers. Open-ended questions take much more time to code and analyze, but they can provide richer, more complex information.

Keep in mind the conditions under which you are administering the survey. How much time will the busy people in your program have to answer your questions? Will they be on their way out the door? Is there a possibility that they could have just had a bad experience? Try to find a time when people will have a couple of minutes to focus on the questions and won't be rushed. This will not always be possible, so it is important to take external factors into account and be as sensitive to them as possible.

Try to keep the survey to a page or less. The length of the survey almost always affects the number of responses you get back.

Don't use a mail survey unless the survey is very short, the participants are very committed to the program, or you have the time to carry out lots of nagging follow-ups to encourage people to return their surveys.

Make the survey friendly to readers. Cramming all of the questions together can make the survey difficult to read. Don't use a font that is too small or too hard to read. 

Pilot your survey with five or six people (who are similar to the folks who will fill out the final questionnaire) to make sure that people interpret the questions in the ways you intended. Nothing is more distressing than finding out after you have already conducted the survey that your most important questions were confusing and that the responses don't make sense.

Pay close attention to the wording of the questions and the order in which they are asked. The questions should flow from topic to topic so that people aren't jarred by abrupt changes of subject. Avoid ambiguous words.

Use language that people filling out the survey will find easy to understand. Keep the words simple. If you think you need to translate the survey, you probably do, but make sure that participants can read the language you translate it into.

Observations

Observing a program can be another good way to capture and accurately describe what is going on. Your evaluator can see with his or her own eyes what is happening rather than depending on what is reported secondhand.

Observations by an independent evaluator are valuable because they can provide a different point of view than those of individuals who are closely involved with the program.

If you decide to do an observation, don't reinvent the wheel. Before you create your own observation system, look for existing instruments you can adapt. This could save you a lot of time.

To make the observation results credible, it is a good idea to use at least two observers who are not connected to the program.

Observations aren't any different from surveys and other forms of data collection in that you need to have your categories and areas for study decided beforehand. A good evaluator will spend time with you designing the observation instrument. The observers should have an idea of what to look for and be familiar with the observation form they will be using.

Interviews

Talking with someone one-on-one can be a great way to get specific information, especially if the evaluator needs to pay attention to subtle cues in interpreting responses. 

An interview is a great way to gather information from people who find reading challenging (e.g., children, or people with visual impairments).

An interview also allows for flexibility in the direction of the conversation. While you should have some idea of the questions you want to ask, you can be somewhat flexible. You never know what an answer will be or where it might lead you; it might open the door to an issue or idea you never thought of.

Beware of the amount of time it takes to conduct interviews. A survey might take someone five to ten minutes to complete. An interview could take 45 minutes, plus another 3 hours to transcribe and code. You should take this into account when determining the number of people you would like your evaluator to reach.
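
To see how quickly those hours add up, here is a minimal back-of-the-envelope sketch in Python. The per-person figures are simply the illustrative estimates from the paragraph above, and the number of participants is an assumption made up for the example, not a recommendation.

    # Rough time-budget sketch; all figures are illustrative assumptions.
    SURVEY_MINUTES = 10            # time for one person to complete a survey
    INTERVIEW_MINUTES = 45         # time to conduct one interview
    TRANSCRIBE_CODE_MINUTES = 180  # time to transcribe and code one interview

    def total_hours(people, minutes_per_person):
        """Total hours needed for a given number of people."""
        return people * minutes_per_person / 60

    people = 20  # assumed number of participants
    print(f"Surveys: about {total_hours(people, SURVEY_MINUTES):.0f} hours of respondents' time")
    print(f"Interviews: about {total_hours(people, INTERVIEW_MINUTES + TRANSCRIBE_CODE_MINUTES):.0f} hours of evaluator time")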

Bias can be a special problem in interviews. People are often cooperative and will try to give you the answer they think you want if you cue them. You don't want to influence the respondent's answers, so it is important that the interviewee does not feel pressured into giving certain answers.

Focus Groups

Focus groups can be a great way to gather information in a short amount of time. In a focus group, the evaluator brings together a group of individuals (often 8 to 10) to talk through a structured set of questions about the program. What are the advantages of focus groups? They can be carried out in a more relaxed environment than an interview, and they offer people the opportunity to bounce ideas off each other. Focus groups can also save time because people are all seen at the same time. The disadvantages: you are not getting confidential or anonymous responses about the program, and a few strong-willed people can influence the result. If you decide to go with a focus group, you will want to make sure that your evaluator has experience in this area. Other pointers:

Focus groups are useful because you can do a content analysis: find common themes about what is and is not working in the program, which types of participants seem to benefit, and what barriers regularly get in the way of the program's success. (A small sketch of this kind of theme counting appears after these pointers.)

Do a little homework. Before you invite the whole office, find out about the personalities of the people who will be participating.

It is not a good idea to include the boss with the other employees. Depending on what you are trying to find out, the boss's presence may limit what the employees might otherwise say.

Be aware of group dynamics. Is there one person who has a tendency to talk too much? Do two of the co-workers not get along at all? Develop a set of questions that encourages everyone who attends to get involved, or consider holding more than one focus group.

While conducting a focus group takes less time than interviewing each person individually, remember to build in time to analyze the results, and possibly allow some money to have the discussion transcribed.
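
For those curious what a very simple content analysis might look like once the focus group is transcribed, here is a minimal sketch in Python. The transcript file name and the theme keywords are made-up examples; a real content analysis means reading and coding the transcript carefully, not just counting keywords.

    # Minimal, illustrative theme count over a focus group transcript.
    # The file name and theme keywords below are assumptions for this example.
    from collections import Counter

    themes = {
        "scheduling": ["schedule", "timing", "conflict"],
        "staffing": ["staff", "counselor", "volunteer"],
        "transportation": ["bus", "ride", "transport"],
    }

    with open("focus_group_transcript.txt", encoding="utf-8") as f:
        text = f.read().lower()

    counts = Counter({theme: sum(text.count(word) for word in words)
                      for theme, words in themes.items()})

    for theme, n in counts.most_common():
        print(f"{theme}: mentioned roughly {n} times")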

Examine Records

As program staff, you may have records that will be helpful to the evaluator, such as attendance records, sign-in sheets, and permission slips. This type of information can provide detailed accounts of the "ins and outs" of a program.

Encourage your evaluator to use what is already out there, and let him or her know what is available. Data that has already been recorded for some other purpose can sometimes be used for evaluation purposes. Time can be saved by taking a look at what is already documented and either using it in its entirety or pulling out the useful pieces.

Watch out for missing data. Even though records are a good source of data, it is important to make your evaluator aware that some information may be incomplete or missing. You might want to check on that before the evaluator begins the analysis. You don't want to waste time by discovering in the middle of the analysis that chunks of data are missing.
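
If your records happen to live in a spreadsheet, a quick scan like the sketch below can flag blank entries before you hand anything over. This is only an illustration; the file name sign_in.csv and its columns are hypothetical, and a careful manual check of paper records works just as well.

    # Quick scan of a sign-in sheet for blank entries before analysis begins.
    # "sign_in.csv" and its column layout are hypothetical.
    import csv

    missing = []
    with open("sign_in.csv", newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        for row_number, row in enumerate(reader, start=2):  # row 1 holds the headers
            blanks = [column for column, value in row.items() if not (value or "").strip()]
            if blanks:
                missing.append((row_number, blanks))

    print(f"{len(missing)} row(s) have missing entries")
    for row_number, columns in missing[:10]:  # show the first few problem rows
        print(f"  row {row_number}: missing {', '.join(columns)}")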

Make sure that you have the needed permission for the evaluator to use the information you find this way. Examining records such as counselor files raises legal and ethical issues, so find out whether there are any restrictions.

by Hilary Lloyd