The phrase “learning outcome” used to make Michelle Whipple’s skin crawl.
The interim senior associate director of programs at Purdue University remembers submitting her assessment results minutes before they were due. But an experience while teaching a Zumba class changed her view of assessment completely.
At the end of the class, she received the comment, “It must be nice to exercise for a living.” Whipple said it ruffled her feathers, as she did much more than dance, lift weights and count to eight. “For the next few days, I had a chip on my shoulder that had everything to do with the desire to demonstrate the fitness program I oversaw at the time enhanced the quality of life and educational experience of Purdue students, faculty and staff,” she said. “Since that day, I made it my mission to master the assessment of learning experiences, to demonstrate with evidence how out-of-class experiences contribute to the all-familiar student affairs phrase, ‘student success.’”
Whipple isn’t alone in initially hating assessment. Katie White, the associate director of Programs and Assessment at Towson University, saw it at first as just another tedious task. She has since learned its value, but that doesn’t mean everyone else has. So when it comes to getting a team on board with assessment, she offered several tips:
- Buy-in doesn’t happen overnight. Have patience and help staff understand why assessment is key to helping them improve, learn and try new things.
- Offer training and one-on-one guidance to staff as they are creating their annual assessment plans.
- Require that decisions be made based on data rather than a hunch or a feeling; this helps create a culture of assessment. For example, White said her team was working to decrease forfeits and planned to gather data from games played, plus on-the-spot questions to team captains, to gauge the impact of new forfeit rules. She noted Connect2Concepts has been an excellent data collection tool for projects like this; a rough sketch of that kind of before-and-after comparison follows this list.
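As one illustration of what a data-over-hunch check could look like, here is a minimal Python sketch that compares forfeit rates before and after a rule change. It assumes game records can be exported to a CSV; the file name, column names, cutoff date and “yes” flag are all hypothetical, not an actual Connect2Concepts export format.

```python
import csv
from datetime import date

# Hypothetical CSV export of game records; assumed columns: league, game_date, forfeited
CUTOFF = date(2024, 1, 15)  # assumed date the new forfeit rules took effect

before = {"games": 0, "forfeits": 0}
after = {"games": 0, "forfeits": 0}

with open("game_records.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Sort each game into the pre- or post-rule-change bucket by its date
        bucket = before if date.fromisoformat(row["game_date"]) < CUTOFF else after
        bucket["games"] += 1
        if row["forfeited"].strip().lower() == "yes":
            bucket["forfeits"] += 1

for label, counts in (("Before new rules", before), ("After new rules", after)):
    rate = counts["forfeits"] / counts["games"] if counts["games"] else 0.0
    print(f"{label}: {counts['forfeits']}/{counts['games']} games forfeited ({rate:.1%})")
```

Pairing a simple count like this with the captains’ on-the-spot answers gives both the numbers and the context behind them.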
Andrew Chadick, the senior associate director of Programs and Assessment at the University of Texas at San Antonio, tries to simplify assessment as much as possible through resources and templates. But ultimately, he said, it starts with educating staff on why assessment matters and on the need to provide data, not just anecdotal evidence.
Whipple agreed that making assessment simple, fun and valuable helps staff get on board. She also created an assessment champions model that identifies 16 staff members willing to learn and take the lead on assessment projects in their areas. Each champion chooses a learning, operational or strategic outcome to focus on for a year.
There are several major educational components to the model:
- Assessment workshops on topics like how to write learning outcomes and surveys, how to conduct focus groups, and how to assess without a survey.
- One-on-one coaching sessions with each champion, where Whipple asks questions like, “What’s important to you this year?” or “What is a problem or challenge you have?” to help initially determine the outcome.
- A showcase at the end of the year where each champion presents their project in a TED Talk fashion.
“The assessment champion gets to show and tell the work they do, and the audience learns how each staff member contributes to the overall success of the department,” said Whipple. “Honestly it brings a few happy tears to my eyes.”
While staff buy-in can be the biggest hurdle to assessment taking root on your campus, there are a few more aspects to consider. For example, Chadick said his biggest challenge has been response rates for participant satisfaction surveys. One way his team has addressed this is by incentivizing responses with something like a gift card drawing, though it is still hit or miss.
For White, the solution to this is to use your student employees. She said they are a captive audience and drive nearly everything a rec center delivers. In fact, about 75% of Towson’s assessments are completed with student employees. “We still obtain plenty of the participant voice through usage/participation data, sales, interviews, focus groups and other means; it’s just not the primary data,” she said.
There are also plenty of resources to draw on for your assessment, starting with other departments on campus. Whipple said you should get to know your technology and institutional data resources, and ask them how to optimize new hardware or software for assessment. She also said your marketing or integrated communications team should know how to take data and make it much more appetizing to your stakeholders.
Chadick also spent time establishing relationships with those in institutional research. While one downside of working with campus partners is slower response and delivery times, since they are juggling other projects, he said the pros outweigh the cons. “It has taken some time, but we’re now in a place where they have a much better understanding of what we do and the amount of data we can share with them,” he said. “It has been a mutually beneficial relationship.”
In fact, deciding what to assess should be tied to priorities beyond your rec center and department. White explained you should always prioritize connecting your assessment to university and presidential priorities, mirroring the language they use. For example, your student government doesn’t know what 150,000 square feet of rec space looks like; a comparison like “three football fields” connects with them better.
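The football-field comparison also holds up arithmetically, assuming a standard field of 360 by 160 feet including the end zones (about 57,600 square feet):

$$\frac{150{,}000\ \text{ft}^2}{360\ \text{ft} \times 160\ \text{ft}} = \frac{150{,}000}{57{,}600} \approx 2.6,$$

or roughly three fields.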
Whipple also noted you should read your campus newsletter and letters from the president or provost. “You need to know what they care about, so you are aligning the evidence you are collecting and reporting with things they care enough about to read,” she said.
Assessment can have a large impact. White said data has informed everything from how they train staff to how they share resources to educational campaigns at Towson. On the flip side, data is also needed to buttress decisions. Whipple said her department this year transitioned to InnoSoft Fusion, added Connect2Concepts and brought in Les Mills fitness programming. “With all of these changes, there needs to be a demonstration of impact, and that impact better include numbers alongside testimonials and stories from our members,” she noted.
All in all, while many may still describe their relationship with data as love-hate, it seems that with a focus on simplicity, value and fun, data can evolve to be more loved than hated.