Writing Assessment Questions: Keep It Simple

The month of May at Oh no is focusing on making assessment easy. I mean, that’s always our focus, but we’re really homing in on it this month. Today I want to bring all of Mark’s and my theoretical musings on assessment and its purpose down to some tangible basics.

In my office (Career Center! Woot! Woot!) we have nine career learning outcomes. We hope that, by working with us, students make significant progress toward them by the end of their four years at Carleton. And our purpose for having these learning outcomes is fourfold:

[1] to be transparent with students (and families and the College community too) about what students will learn/be able to do by interacting with the Career Center, so that students can be partners with us in driving their learning,

[2] to hold ourselves accountable for offering programs and services that help students progress in this learning, being intentional in making sure our programs and services serve the purpose of helping students learn,

[3] to hold students accountable for being the drivers of their own learning and career development (because I can’t get the job for you, and more than that, I can’t decide for you which career is going to help you pursue your meaningful life), and

[4] to show the value/impact of working with the Career Center.

Here’s a sampling of the learning outcomes:

  1. Understand how to work through a process of self-assessment and be able to identify their transferable and work-related skills.
  2. Learn about a wide variety of career fields and opportunities.
  3. Be able to market themselves through written communication to prospective employers and networks.

So if you’ve written some learning outcomes (hooray!), at this point, how do you write assessment questions that tell you whether students learned them or not???

An easy way is…brace yourself…to just ask them if they learned it. Revolutionary, I know. Well worth reading this blog, huh? ;-P Here’s what I mean. If my learning outcome is:

Be able to market themselves through written communication to prospective employers and networks.

Then ask:

Indicate how much you [i.e., the student] agree or disagree with the following statements about the [insert program/service].

After attending the [insert program/service], I am able to market myself through written communication to prospective employers and networks.

Strongly Agree | Agree | Neither Agree nor Disagree | Disagree | Strongly Disagree

Done!
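And if you want to go one step further and actually tally those responses, here’s a minimal sketch (in Python, with invented response data and a hypothetical 1–5 coding) of how you might turn a stack of Likert answers into a couple of summary numbers:

```python
from collections import Counter

# Hypothetical coding of the Likert scale (5 = Strongly Agree)
SCALE = {
    "Strongly Agree": 5,
    "Agree": 4,
    "Neither Agree nor Disagree": 3,
    "Disagree": 2,
    "Strongly Disagree": 1,
}

# Invented responses to the "I am able to market myself..." item
responses = [
    "Agree", "Strongly Agree", "Neither Agree nor Disagree",
    "Agree", "Disagree", "Strongly Agree", "Agree",
]

counts = Counter(responses)
scores = [SCALE[r] for r in responses]

mean_score = sum(scores) / len(scores)
pct_agree = 100 * sum(r in ("Agree", "Strongly Agree") for r in responses) / len(responses)

print(f"Mean score: {mean_score:.2f} / 5")
print(f"Agree or Strongly Agree: {pct_agree:.0f}%")
for option in SCALE:
    print(f"{option}: {counts.get(option, 0)}")
```

A mean score and a percent-agree number are plenty for a staff meeting; no statistics degree required.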

Now, some of you assessment scholars (who I’m sure have nothing better to do than read this blog full of Sailor Moon gifs and Mark’s jokes ;-P) might say that using solely this kind of assessment is too shallow and wouldn’t hold up in a journal article. And to that I say: this would be only one part of a more rigorous methodology for publishing an article. BUT, I’m guessing most of you aren’t reading Oh no because you’re trying to publish, but rather to better your current practice/office initiatives. And as we’ve mentioned before, there is value in using student-reported data (especially when it’s benchmarked in one year and then measured again the next). Assessing your services will not only keep your initiatives in line with the mission of your office/institution and the learning environment, it’ll also give greater purpose and structure to the work you do.
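If “benchmarked in one year and then measured again the next” sounds abstract, here’s one hypothetical way it could look in practice, with invented numbers: set this year’s mean agreement as your baseline, then compare next year’s results against it.

```python
from statistics import mean

# Invented mean-agreement data (1-5 Likert coding) for the same
# question asked two years in a row.
last_year = [3, 4, 3, 4, 2, 3, 4, 3]
this_year = [4, 4, 5, 3, 4, 4, 5, 3]

baseline = mean(last_year)  # the benchmark set in year one
current = mean(this_year)   # this year's measurement

print(f"Benchmark (last year): {baseline:.2f}")
print(f"This year:             {current:.2f}")
print(f"Change:                {current - baseline:+.2f}")
```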

There is so much more to say on this topic, but for now, I wanted to give a very specific, practical, simple, “you could implement this minute” tip. What are some tips you might have for keeping assessment questions simple?


Assessing FoMO: the Value of Self-Reported Data

Fear of Missing Out (FoMO) is a term any millennial knows. That deep, almost painful feeling of wanting to be where all the fun is being had. Where is that magical place? It’s wherever you’re not. I hear this in student conversations all the time, usually in relation to evening and weekend activities. When did “Jealous” become an appropriate response to a friend regaling you with tales of Saturday night?? Since FoMO.

Since Fear of Missing Out has such a strong hold on so many college-age people, maybe FoMO can provide some glimpses into student motivation. In student affairs, understanding topics like student motivation, development, leadership, and similar concepts is important and impactful to our work. But how do you measure them? Capturing these concepts is messy and (mostly) requires data that comes from students’ own perceptions of themselves (i.e., self-reported). Numerous data people in my life look down upon self-reported data, saying, “How do we know (definitively and without flaw) that what a student reports is ‘accurate data’?” First of all, to that I say self-reported information IS data; and second, I think this attitude fosters a different kind of fear in student affairs professionals. We fear that we cannot “do data” because often much of it is self-reported. In order to quell the fear that some of you may be feeling, I want to look at one example of a study done using self-reported data.

A few researchers, Andrew K. Przybylski, Kou Murayama, Cody R. DeHaan, and Valerie Gladwell (2013), studied the FoMO phenomenon. I’m no statistician, and I imagine you may not be either, so I suggest you start reading at the summary (pp. 1846-1847). But in case you’re not going to read it at all, here it is in a rough nutshell: the researchers built a FoMO scale (wow, how can I take THAT quiz??) and used it with three groups of people across three studies. The results are ones you probably would have guessed (e.g., people experiencing higher levels of FoMO tend to be younger, have less satisfaction in their own abilities and connections with others, and engage more frequently with Facebook), but doesn’t it feel good when your thoughts are backed by data?! :::le sigh:::

Now that you’re interested in the study (hopefully?), jog back to the methods and measures portions of the article, and you’ll find that these researchers tapped into something many in student affairs have been wondering: how do I take all the stuff I know to be true about student motivation, development, and learning outside the classroom and test it? And then how do I capture those findings in a format that has that data-strength je ne sais quoi, the kind that can give out-of-the-classroom learning the credibility it deserves?

I know there were some scary words in this study like “regression model” and “confidence interval,” but if you push through them and look at the bigger picture, I hope you’ll delight in the fact that the researchers used participant self-reported data to show impact (you know, similar to when we ask students if they learned anything about themselves at our programs).
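And if “confidence interval” is the scariest of those words, here’s a toy sketch, with invented numbers that are emphatically not from the study, of what computing one for self-reported scores looks like:

```python
from math import sqrt
from statistics import mean, stdev

# Invented self-reported scores on a 1-5 scale -- NOT data from
# Przybylski et al., just an illustration of a confidence interval.
scores = [4, 3, 5, 4, 4, 2, 5, 3, 4, 4, 3, 5]

n = len(scores)
m = mean(scores)
se = stdev(scores) / sqrt(n)  # standard error of the mean

# 95% interval via the normal approximation (z = 1.96); a t critical
# value would be a bit wider and more appropriate for a small sample.
low, high = m - 1.96 * se, m + 1.96 * se
print(f"Mean self-reported score: {m:.2f}")
print(f"95% confidence interval: ({low:.2f}, {high:.2f})")
```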

So, hooray! Ask your students to report their learning back to you in some sort of systematic way, and work the results into your next staff meeting. It doesn’t have to be hard; you don’t want to miss out, right?

How do you feel about asking students to self-report their learning, development, etc.? Have you had similar experiences with self-reported student data?

Thanks to Computers in Human Behavior and Andrew K. Przybylski, Kou Murayama, Cody R. DeHaan, and Valerie Gladwell (2013) for the great article, and to College Humor for the great video.