Six Essential Steps for Evaluating Your Chorus’s Education Program

Your chorus has created and launched an exciting education program. How will you measure its effectiveness? Kimberly Meisten of VocalEssence and David Myers of the University of Minnesota School of Music presented some winning ideas at the 2012 Chorus America Conference.

1. Establish a focus for your evaluation. 

Consider: What do I really want to learn about this educational program? “A chorus may know it is fulfilling one part of the goal of an educational program, but not be so sure about another part of the goal,” Meisten said. “For example, you might want to know if you are reaching the audience you want to reach. Or you may need to know if students are learning what you want them to learn. Those are two very different questions. And you would use very different surveys and tools to find out the answers.” 

Also, avoid an evaluation that asks about things you have no power to change. “If, for example, it turns out that pricing is a barrier,” Meisten said, “do you really have the ability to address that or not?” If you don’t, then maybe you shouldn’t ask the question.

Remember, you don’t need to evaluate every program or project of your chorus simultaneously. “No one has the staffing or money to do that,” Meisten noted. “And you don’t need to evaluate everything every single year.” 

Also, pare down the questions you want to ask. “You don’t need to ask 500 questions on a survey when you can really get at what you want to know with three,” Meisten said. “Be realistic about what your organization can do.”

2. The purpose of an evaluation is to understand and/or improve programs, not to prove that they work. 

“Oftentimes you want to prove that a program works, because you want to show your funders that you are successful,” Meisten said, “but funders are interested in seeing what worked and what didn’t. If it is a learning experience for you, they are not opposed to hearing that.”

Be careful that your evaluation is objective and permits a complete continuum of responses. If you ask, “What was the best part of the program for you?” be sure to also ask, “What would make this a better experience for you next time?”

“You don’t want a biased survey,” Meisten said. “That doesn’t do anyone any good. You want to find out what is working or not working so you can make it better.” 

3. Determine what can be done internally, at minimal cost.

Find out if you have evaluation expertise on your own staff, board, or circle of strong supporters. Tap local universities and colleges for help with compiling surveys, tabulating data, and summarizing findings. “We have gotten some fabulous interns who are looking for a good experience,” Meisten said. “They are not necessarily in the music department. You might find students in the statistics department or the research and evaluation department.”

VocalEssence partnered one year with an arts administration class at St. Olaf College to do an evaluation of its WITNESS program. “It’s great when the evaluation can be part of the curriculum for a class,” Meisten said. 

4. Develop your evaluation plan.

Once you have decided what you need to find out in your evaluation, consider how you will:

  • Collect the necessary information, and from whom. There is an array of ways to gather it, from interviews, focus groups, and observation to written surveys and online tools like SurveyMonkey. 
  • Organize the information. 
  • Analyze the information.
  • Report the information to your board, funders, and other relevant audiences. “What is often missed is sharing the evaluation results with the people you asked the questions of,” Meisten said. “It helps an organization to be trustworthy if it is transparent about the reasons and the data behind decision making. Plus, people see that something came from their time spent answering questions. They come away feeling valued.”
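For choruses with a bit of scripting help on hand (an intern from the statistics department, say), the collect/organize/analyze/report cycle above can be sketched in a few lines. Everything here is hypothetical: the sample responses, the `rating` and `comment` field names, and the summary format are illustrative stand-ins for whatever your survey tool actually exports.

```python
from collections import Counter

# Collect: hypothetical survey responses; real data would come from
# your survey tool's export (e.g., a CSV download).
responses = [
    {"rating": 5, "comment": "Loved working with the composer."},
    {"rating": 4, "comment": ""},
    {"rating": 5, "comment": "Would like more rehearsal time."},
    {"rating": 3, "comment": "Hard to hear in the back."},
]

# Organize: tally the quantitative ratings.
rating_counts = Counter(r["rating"] for r in responses)

# Analyze: an average rating, plus the open-ended comments worth reporting.
average = sum(r["rating"] for r in responses) / len(responses)
comments = [r["comment"] for r in responses if r["comment"]]

# Report: a short summary for your board or funders.
print(f"Average rating: {average:.2f} (n={len(responses)})")
for comment in comments:
    print("-", comment)  # attribute quotes anonymously, e.g., 'a student'
```

Even a toy pipeline like this keeps the quantitative tally and the open-ended quotes together, which mirrors the blend of data types discussed in the next step.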

5. Use multiple data-gathering strategies, including both quantitative and qualitative data.

The most effective evaluations blend quantifiable data with subjective, open-ended responses. Funders appreciate getting both kinds of information, too. “Subjective, open-ended questions provide you with the responses that warm the heart,” Meisten said. “They get at funders’ motivation for funding you. But with only subjective feedback it can be harder to reach conclusions. Having some quantitative data helps support or clarify those open-ended questions.”

When reporting responses to open-ended questions, you need not use people’s names, and should not without their permission. “Privacy is something to be aware of, especially when working with schools,” Meisten said. “Simply attributing the quote to ‘a teacher’ or ‘a student’ is fine.” 

6. Remember that there is no perfect evaluation model. 

Evaluation is imperfect, often messy, and rarely 100 percent conclusive. For example, VocalEssence’s first evaluation of its ¡Cantare! program revealed that some of the elementary student respondents had little English proficiency. 

“We should have translated the survey into Spanish,” Meisten said. “We learned something from that. Now if we are surveying a school that is mostly Spanish speaking, we have a survey in Spanish.” 

Evaluation is not just for assessing a program when it is over. “You could do a survey before the program starts to get a sense of what people like and what your audience really wants,” Meisten said. “Or interview people while the program is going on—what we call formative evaluation—so that you can tweak things. That can be incredibly valuable.” 

VocalEssence used a formative evaluation as it was developing its ¡Cantare! educational outreach program. “We surveyed students and asked, ‘What are three things you learned working with the composer, and three things you would like to learn from the composer?’ It was super short, and we learned that students wanted to know a lot more about why the composer wrote the song for the group. What was the inspiration? We told the composer that, and he was able to address it during the next residency.”

Evaluating your education programs can have an impact on the programs themselves but also on the larger choral organization. “Our evaluation has helped our board and senior staff recognize how important education is to the future of the choral arts,” Meisten said. “Our strategic plan has an advocacy component now. Sharing the evaluation with leadership has impacted our direction.”

Resources

The American Evaluation Association offers an array of resources for organizations wanting to evaluate their programs.