The month of May at Oh no is focusing on making assessment easy. I mean, that’s always our focus, but we’re really homing in on it this month. Today I want to bring all of Mark’s and my theoretical musings on assessment and its purpose down to some tangible basics.
In my office (Career Center! woot! woot!) we have nine career learning outcomes, and we hope that by working with us, students make significant progress toward them by the end of their four years at Carleton. Our purpose for having these learning outcomes is fourfold:
 to be transparent with students (and families and the College community too) about what students will learn/be able to do by interacting with the Career Center, so that students can be partners with us in driving their learning,
 to hold ourselves accountable to offer programs and services that help students progress in this learning, being intentional to make sure our programs and services serve the purpose of helping students learn,
 to hold students accountable to be the drivers of their learning and career development because I can’t get the job for you and more than that, I can’t decide for you which career is going to help you pursue your meaningful life, and
 to show the value/impact of working with the Career Center.
Here’s a sampling of some of the learning outcomes:
- Understand how to work through a process of self-assessment and be able to identify their transferable and work-related skills.
- Learn about a wide variety of career fields and opportunities.
- Be able to market themselves through written communication to prospective employers and networks.
So if you’ve written some learning outcomes (hooray!) then at this point, how do you write your assessment questions to assess if the students learned them or not???
An easy way is…brace yourself…to just ask them if they learned it. Revolutionary, I know. Well worth reading this blog, huh? ;-P Here’s what I mean. If my learning outcome is:
Be able to market themselves through written communication to prospective employers and networks.
Then my assessment question could look like this:
Indicate how much you [i.e., the student] agree or disagree with the following statements about the [insert program/service].
After attending the [insert program/service], I am able to market myself through written communication to prospective employers and networks.
Strongly Agree | Agree | Neither Agree nor Disagree | Disagree | Strongly Disagree
Now, some of you assessment scholars (who I’m sure have nothing better to do than read this blog full of Sailor Moon gifs and Mark’s jokes ;-P) might say that using solely this kind of assessment is too shallow and wouldn’t hold up in a journal article. And to that I say, this would be only one part of a more rigorous methodology for publishing an article. BUT, I’m guessing most of you aren’t reading Oh no because you’re trying to publish, but rather to better your current practice/office initiatives. And as we’ve mentioned before, there is value in using student-reported data (especially when it’s benchmarked in one year and then measured again the next). Assessing your services will not only keep your initiatives in line with the mission of your office/institution and the learning environment, it’ll also give greater purpose and structure to the work you do.
There is so much more to say on this topic, but for now, I wanted to give a very specific, practical, simple, “you could implement this minute” tip. What are some tips you might have for keeping assessment questions simple?