Assessment Summer Retreat

Hello assessment friends! The summer has been fun, but it's flying by. It kicked off with Mark's bachelor party (in Niagara Falls) and then the wedding – there were beautiful vows, dancing, and pie; I was a groomsmaid; a rooster crowed loudly throughout the ceremony; and it was a happy day.


Congratulations Mark & Kate!

Then, I went to Disneyland and it was truly magical.


And now it’s time to get back to work! 

What I hear from people all the time is that they've done so much good, diligent work throughout the year collecting program attendance numbers, student feedback, and surveys, but they have no time (and maybe some analysis paralysis) to dig into what all that delicious data says about what students learned.

If you're like me, you work amongst thoughtful, passionate, student-centered professionals. And summer in the office is THE time to reflect, work on projects, and retreat. So, bring all of these elements together – your student usage/engagement data and feedback, great colleagues, and summer reflection time – to tackle the question: how did all of our initiatives and efforts with students impact their learning?

Next week my colleagues and I will be retreating (well…not actually away from campus, but you get what I mean) to do just this: review our office goals and student engagement data, figure out what it all means, and strategize and set a vision for the coming year. Specifically, we'll be focusing on these questions (with a rough data-exploration sketch after the list):

  • What patterns do you see in our student usage data?
  • What do the data and patterns say to you about how students engaged with our initiatives and services?
  • How did all of our initiatives and efforts with students impact their learning?
  • How does our student engagement data relate to our annual office goals?
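For the first two questions, a quick script can give the retreat conversation something concrete to chew on. Here's a minimal sketch, assuming a hypothetical usage.csv export with made-up columns (student_id, service, term) – swap in whatever your office actually tracks:

```python
# A rough first pass at patterns in student usage data.
# Assumes a hypothetical usage.csv with columns: student_id, service, term.
import pandas as pd

usage = pd.read_csv("usage.csv")

# Which services get used, and in which terms?
print(usage.groupby(["term", "service"]).size().unstack(fill_value=0))

# How many distinct students does each service actually reach?
print(usage.groupby("service")["student_id"].nunique().sort_values(ascending=False))

# Are a handful of students generating most of the visits?
print(usage.groupby("student_id").size().describe())
```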

Don't be afraid of not having all the answers, and don't be afraid to struggle as a group through these types of questions. Working through them will help you make data- and assessment-driven decisions that ultimately help students (and, hopefully, it will be a little fun too!).

Happy retreating!


I collected data! Now what?!

We're coming to the close of yet another academic year and you did it! You surveyed students, or tracked who did (and didn't!) visit your office, or measured the student learning outcomes from a program, or whatever we keep preaching about on this blog. But now what?! If you read any assessment book, at this point there are common next steps like "post-test" and "close the loop" and a bunch of other common (and good!) assessment wisdom. But sometimes that common assessment wisdom doesn't actually help any of us professionals DO something with all this data. Here are a few things I do with my data before I do something with my data:

  1. Share the data in a staff meeting: Your colleagues may or may not be formally involved in the specific program you assessed, but they work with the same students, so they'll be able to make connections – within student learning and to other programs/services – that you're missing. Ask them about the themes they're seeing (or not seeing!) within the data. It'll help you clarify what your data says, bring more people into the assessment efforts in your office (more heads are better than one!), and it's a nice professional development exercise for the whole team. Teamwork makes the dream work!
  2. Talk to peer colleagues about their version of the same data: Take your data* to a conference, set up a phone date with a colleague at a peer school, or read other schools’ websites. Yes, you’ll likely run into several situations that aren’t directly applicable to yours, but listen for the bits that can inspire action within your own context.
  3. Take your data to the campus experts: Know anyone in Institutional Research? Or the head of a curriculum committee? Or others in these types of roles? These types of people work with the assessment process quite a bit. Perhaps take them to coffee, make a new friend, and get their take.
  4. Show your data* to student staff in your office: Your student staff understand the inner workings of your office AND the student experience, so they’re a perfect cross section of the perspective that will breathe life into the patterns in your data. What do they see? What data patterns would their peers find interesting? What does it mean to them?

WOW, can you tell I’m an extrovert?! All of my steps include talking. Hopefully these ideas will help you to not only see the stories of student learning and programmatic impact in your data, but also to make the connections needed to progress toward closing the loop.

* This goes without saying, but a reminder is always good: make sure to anonymize the data you show students and those outside of your office/school!
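On that note, here's a minimal sketch of what an "anonymize before sharing" step can look like in code. The column names (name, email, student_id) are hypothetical placeholders, and a real pipeline would want a properly managed secret salt:

```python
# Strip direct identifiers before data leaves the office.
# Column names here (name, email, student_id) are hypothetical.
import hashlib
import pandas as pd

def anonymize(df: pd.DataFrame, salt: str = "change-me") -> pd.DataFrame:
    out = df.drop(columns=["name", "email"], errors="ignore")
    # A one-way hash keeps records linkable across programs
    # without exposing who any individual student is.
    out["student_id"] = out["student_id"].astype(str).map(
        lambda s: hashlib.sha256((salt + s).encode()).hexdigest()[:12]
    )
    return out
```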

Guest Blogger: When Assessment and Data Are Too Much of a Good Thing


I love coffee. Strong and black. Coffee does something to my soul when I drink it. The smell of roasted coffee beans and steam coming from my favorite mug brings a smile to my face. Beginning my morning with a hot cup of coffee is sure to set a positive tone for the rest of the day. I love coffee. Then something happens. It's 3 o'clock and I realize I'm on my fourth cup of the day. During this realization, I begin to notice my elevated heart rate, a funny feeling in my stomach, and that I'm a bit more energized than is good for me. The truth is I had too much of a good thing.

What is your "too much of a good thing"? I'm convinced we all have one, whether we want to admit it or not. Nowadays it seems assessment and data have become one of higher education's good things that we have too much of. I want to be clear: assessment and data are necessary in higher education. Both are good, just like coffee. However, when we have too much of them and do not use them effectively, this good thing turns into something bad. I see this show up most often in three ways, described below.

  • Quality over quantity. "Assess more and have more data" has been the message given to many higher education professionals. More isn't inherently bad, but it also isn't always necessary. When we expect professionals to assess more, are we equipping them with the skills to build effective assessment instruments? Are we being thoughtful about targeting what we assess instead of assessing everything? Do we consider survey fatigue? We must consider these questions. Creating fewer, more effective assessment tools that provide rich data – instead of conceding to the pressure to assess everything – will serve professionals well. Switching the focus to quality over quantity is a shift higher education must consider.
  • Dust-filled data. When we leave something in a corner and don't attend to it, dust collects. The same happens with data in higher education. When we conduct multiple assessments but never act on the results, that data sits collecting dust. Because most of our data is stored electronically we don't see the dust, but it's there. It's not enough to say we did an assessment. We must go a step further and use the data! We must analyze the information we've collected, share it with the folks who need to know, and adopt a plan for how the data will be used. When we do this, our assessment becomes purposeful. When we do this, our investment in that specific assessment is justified. When we do this, our colleagues and students are best served. What timeline can you set for yourself to keep dust from gathering on your data? What data currently needs dusting off?
  • Over our heads. Some higher education professionals have done a great job assessing in effective ways and utilizing the data collected. However, the dissemination of that data goes over our heads. The pressure professionals feel has turned into the need to create 30-page analyses of data. What happened to one-page summaries? When will we use technology to disseminate our data? How can we make the analysis and presentation of the data interesting, so people want to read and use it? These are all questions we should be asking when considering the dissemination of data. I have found infographics to be a highly effective way to disseminate information in an accessible format. Making changes to better share our stories is beneficial and necessary.

Assessment is a good thing. Making data-driven decisions is a good thing. We know this to be true. To ensure assessment doesn't continue as too much of a good thing, professionals must consider the implications of the current way we do assessment in higher education. The survey fatigue students experience, the pressure to have data when making decisions of any size, and the expectation that we assess everything under the sun have clouded the goodness of assessment. How are you doing with quality over quantity? What data needs dusting off in your office? How can you make data accessible to all? Considering these questions will get you one step closer to keeping assessment good. Because remember, like my fourth cup of coffee in the afternoon, you want to steer clear of having too much of a good thing.

Mika Karikari is currently a doctoral student in Student Affairs in Higher Education at Miami University as well as an Associate in Career Services. Her professional background also includes academic support, residence life, and new student programs. You can follow Mika on Twitter @MikaKarikari or email her at johns263@miamioh.edu.
  

Zero to Assessment

As you know, it's Make Assessment Easy Month here at Oh No. In the Engineering Advising Center, we recently (last year) revamped our office assessments, and I've learned oodles in the process. Whether you're creating an office-wide strategy or a strategy to measure the success of a specific program owned by your office, these four steps (which I picked up from NACADA's 2014 Assessment Institute) can help you get from nothing to a simple, focused, and effective strategy. Most of the links I reference come from NACADA, though the concepts are applicable to more than just advising.

Step 1, Create Learning Outcomes: NACADA recommends that learning outcomes focus on what we want students to know, do, and value (see the last paragraph of the Concept of Academic Advising). Keep this list short – we focus on eight outcomes in our office. The longer your list, the longer (and more boring) your report of results. If your colleagues fall asleep while you're discussing the results, you may have too many outcomes.

Step 2, Opportunities for Students to Achieve Outcomes: It's good to have a plan for where we want students to achieve our desired outcomes. This portion might include workshops, advising appointments, tutorials, et cetera. In most cases, this is what you're already doing! Hopefully.

Step 3, By What Time Should Learning Occur? This step helps you indicate when you’d like students to achieve your outcomes. For example, if you’re a career services office and you want students to have created a resume, you probably want that to happen sometime before they’re job searching. We often use student academic years/terms for this. For the resume example, your deadline might be by the end of their first year*.

*Originally I put “junior year” here. Abby’s response gave me the sense that career services folks would riot in the streets if this didn’t happen until the junior year. My sincere apologies! Feel free to pretend this deadline is anytime you see fit…

Step 4, How Will You Know if the Outcome Has Been Met? We use this step to determine when we’re going to make a measurement. It helps to limit yourself to just a few surveys or queries a year — this keeps your process sustainable. Common times to collect data are at the end of orientation, fall, and spring term.

In the end, you will have a table with the learning outcomes as rows and each of the steps above as a column, like the example below.

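To make that concrete, here's an illustrative version of such a table, using the resume example from Step 3 (the second row is a made-up advising outcome just to show the shape); your rows will obviously differ:

| Learning Outcome (Step 1) | Opportunities (Step 2) | By When (Step 3) | Measurement (Step 4) |
| --- | --- | --- | --- |
| Students create a resume | Career workshops, drop-in resume reviews | End of first year | Spring-term survey |
| Students can read their degree audit | Advising appointments, online tutorial | End of first term | End-of-fall advising query |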

This system works whether you’re creating an assessment for the entire office or if you’re just trying to assess one program. I’m using this process to assess our training and development of our orientation leaders this summer.

I hope you found this table useful. As you start to dive into the process of creating an assessment, you will come across questions that the table does not address (e.g., should we use surveys, focus groups, or some combination of the two? Is our data valid?). Just remember the KISS rule of thumb: Keep It Simple, Steve. You may want to replace "Steve" with your name. The assessment does not have to be perfect. It should be simple enough for you (or someone else) to explain and follow through on.