Where do I begin?


It starts when you notice the local construction projects winding down. Then you’re cut off by a Ford Escape with New Jersey plates and a back full of clothes, books, a pink rug, and one of those chairs made solely of bungee cords. That’s right, it’s back-to-school season.

Abby and I met for a pre-season re-vamp of Oh No, and you can look forward to two posts a week this year — Mondays and Thursdays. We’re trimming a bit because those Friday posts came upon us awful fast and we want to keep this thing valuable. So without further ado…

I met with a colleague from across campus this week. She works for a newer (and smaller) office just starting to wrap its mind around how to capture its value to the students it serves. The office focuses on developing an entrepreneurial mindset in our students and supporting student ideas from conceptualization to implementation. To further complicate its assessment process, the office is not yet on permanent funding and is thus under pressure to justify its existence.

I’ve already covered starting over in my Zero to Assessment post; however, this conversation yielded a few new questions I wanted to chew on a bit.

What if I don’t know what students are learning from this experience? In the old, dusty textbooks of assessment you’ll find a flow chart looking something like this…

[Flow chart: the standard textbook assessment process, starting from clear learning outcomes]

oh, well hello, SmartArt…

This flow chart is helpful if you have clear and measurable learning outcomes, but it leaves out instructions for when your outcomes are a bit cloudy. My colleague proposed measuring this through a series of qualitative questions — which, despite my aversion to the labor-intensive nature of properly analyzing qualitative responses, seemed appropriate given the situation. And you know what, old dusty textbook I made up to illustrate my point: if an office centered on innovation can’t build the plane while flying it, can any office? That is, if we can’t get an initiative started until we have every detail (e.g., assessment) ironed out, we’ll miss out on a good number of valuable initiatives.

While I’m complaining about the rigidity of fictitious textbooks, it’s worth acknowledging that neither she nor I was all too sure of how she would analyze the data she’s collecting. It would be great if she had the labor to code each response, but that doesn’t seem likely. I think this is okay. It takes a few cycles to get an assessment process ironed out. Even by simply reading through the responses, she’ll get a feel for what her students are learning and how to better support them.
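If she ever wants a middle ground between a full coding project and a straight read-through, even a rough keyword tally can surface themes. Here’s a minimal sketch in Python; the file name, column name, and theme keywords are all invented for illustration, and a tally like this is a crude stand-in for proper qualitative coding, not a replacement for it:

```python
import csv
from collections import Counter

# Hypothetical themes and the keywords that hint at them.
# You'd draft a list like this after a first read-through of the responses.
THEMES = {
    "confidence": ["confident", "comfortable", "risk"],
    "connections": ["mentor", "network", "partner"],
    "skills": ["pitch", "prototype", "business plan"],
}

def tally_themes(csv_path, column="response"):
    """Count how many responses mention each theme at least once."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            text = row[column].lower()
            for theme, keywords in THEMES.items():
                if any(kw in text for kw in keywords):
                    counts[theme] += 1
    return counts

if __name__ == "__main__":
    # "survey_responses.csv" is a made-up file with a "response" column.
    print(tally_themes("survey_responses.csv"))
```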

How do I get students to reply to my surveys? If I ever figure this out, I’m leaving the profession of higher education to hang out with the inventor of sticky notes on an island covered in boats, flat-screen TVs, and Tesla convertibles. And, I guess, charging stations for the convertibles.

Very few people know how to do this well; however, I’ve come across a few strategies that seem to be working.

- Make it personal. More than half of my job is forming relationships with students. Surveys are one of the times I leverage those relationships. I’ll often send the survey link in an e-mail noting (very briefly) the importance of the data collected, letting them know that every response is a favor to me (for all of the time I spend e-mailing them answers to questions I’ve already answered in previous e-mails, this is the least they could do). If you’re sending a survey out to thousands of strangers, you can expect a very low return rate.

- Get ‘em while they’re captive. Do you have advising meetings with students at the start of these programs? Is there an application to get into your program? Can you (easily) tie in the survey as a requirement for completing the program? I don’t mean to hint that surveys are the only means of collecting assessment data — but they’re direct, effective, and tend to be less labor-intensive than other means.

Countdown to college football: 3 DAYS!

Zero to Assessment

As you know, it’s Make Assessment Easy Month here at Oh No. In the Engineering Advising Center, we recently (last year) re-vamped our office’s assessments, and I’ve learned oodles in the process. Whether you’re creating an office-wide strategy or a strategy to measure the success of a specific program owned by your office, these four steps (which I picked up from NACADA’s 2014 Assessment Institute) can help you get from nothing to a simple, focused, and effective strategy. Most of the links I reference come from NACADA, though the concepts are applicable to more than just advising.

Step 1, Create Learning Outcomes: NACADA recommends that learning outcomes focus on what we want students to know, do, and value (see last paragraph in Concept of Academic Advising). It’s good to keep this list short. We have 8 outcomes we focus on in our office. The longer your list, the longer (and more boring) your report of results. If your colleagues fall asleep while you’re discussing the results, you may have too many outcomes.

Step 2, Opportunities for Students to Achieve Outcomes: It’s good to have a plan for where and when we want students to achieve our desired outcomes. This might include workshops, advising appointments, tutorials, and so on. In most cases, this is what you’re already doing! Hopefully.

Step 3, By What Time Should Learning Occur? This step helps you indicate when you’d like students to achieve your outcomes. For example, if you’re a career services office and you want students to have created a resume, you probably want that to happen sometime before they’re job searching. We often use student academic years/terms for this. For the resume example, your deadline might be by the end of their first year*.

*Originally I put “junior year” here. Abby’s response gave me the sense that career services folks would riot in the streets if this didn’t happen until the junior year. My sincere apologies! Feel free to pretend this deadline is anytime you see fit…

Step 4, How Will You Know if the Outcome Has Been Met? We use this step to determine when we’re going to make a measurement. It helps to limit yourself to just a few surveys or queries a year — this keeps your process sustainable. Common times to collect data are at the end of orientation, fall, and spring term.

In the end, you’ll have a table with the learning outcomes as rows and each step as a column.

[Table: learning outcomes as rows, the four steps as columns]
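To make that concrete, here’s what one row might look like (the outcome and entries below are invented for illustration, not pulled from our office’s actual plan):

| Learning Outcome | Opportunities to Achieve | By When | How We’ll Know |
| --- | --- | --- | --- |
| Students can build a term-by-term plan of study | Orientation workshop; advising appointments | End of the first year | Question on the spring advising survey |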

This system works whether you’re creating an assessment for the entire office or just trying to assess one program. I’m using this process to assess the training and development of our orientation leaders this summer.

I hope you found this table useful. As you start to dive into the process of creating an assessment, you’ll come across questions that the table does not address (e.g., should we use surveys, focus groups, or some combination of the two? Is our data valid?). Just remember the KISS rule of thumb: Keep It Simple, Steve. You may want to replace “Steve” with your name. The assessment does not have to be perfect. It should be simple enough for you (or someone else) to explain and follow through on.