Updated: Sep 7, 2018
Welcome back to Assessment Month on Carnegie Hacks! This month is all about getting you ready for those prickly tracking and assessment questions that appear throughout the application.
This week we are talking about Results. You know, the data that spits out the other end of your Tracking and Assessment Mechanisms. (For more on Tracking & Assessment Mechanisms, see Carnegie Hack #30.)
Eyes on Results
First-timers will notice that many of the topics covered in assessment mechanisms questions appear in sets of three: each mechanism question comes with a corresponding assessment results question and a uses of results question. Reclassifiers, on the other hand, will notice that many of these questions have been combined into a single item treating mechanism, results, and uses of results in one go.
FYI: to find questions about tracking results, search the application for the term "examples"; to find assessment results questions, search for the term "findings."
Or if you have the Getting Carnegie Classified Get on Track Package, open your Carnegie Roadmap to Appendix B "Assessment Resources."
The Dreaded Impact Questions
Of all the questions addressing numbers and findings, which are relatively few compared to the many questions about policies and practices, four loom menacingly over the heads of many applicants: the impacts on students, faculty, the community, and the institution itself.
But you're in luck! The 2020 application offers far more detail about reviewer expectations than previous cycles did. Finally, the application guidance distinguishes outcomes from impacts across the entire set of questions, and spells out the expectations at the constituent level.
We'll be covering this in greater detail in July and August, but for now, what you need to take away is that it's not enough to simply collect data. The data you collect needs to be analyzed, so you can learn something from it.
We commonly conflate "assessment" with the activity of "data collection." But assessment isn't really assessment until we've learned from our data and used it to improve something.
At their core, these questions about results are asking what you learned. What do you know now that you didn't know before you collected the data? Think in terms of quality, and of the changes that community engagement contributed to. Finally, what insights did your data reveal that will help reviewers understand community engagement on your campus?
In July and August, Carnegie Hacks will dive deeper into the Impact questions. Don't miss it.