Guest Blogger: When Assessment and Data Are Too Much of a Good Thing


I love coffee. Strong and black. Coffee does something to my soul when I drink it. The smell of roasted coffee beans and steam coming from my favorite mug brings a smile to my face. Beginning my morning with a hot cup of coffee is sure to set a positive tone for the rest of the day. I love coffee. Then something happens. It’s 3 o’clock and I realize I’m on my fourth cup of the day. That’s when I begin to notice my elevated heart rate, the funny feeling in my stomach, and that I’m a bit more energized than is good for me. The truth is, I’ve had too much of a good thing.


What is your “too much of a good thing”? I’m convinced we all have one, whether we want to admit it or not. Nowadays it seems assessment and data have become one of higher education’s good things that we have too much of. I want to be clear: assessment and data are necessary in higher education. Both are good, just like coffee. However, when we have too much and do not use it effectively, this good thing turns into something bad. I see this show up most often in three ways, which I describe below.

  • Quality over quantity. “Assess more and collect more data” has been the message given to many higher education professionals. More isn’t inherently bad, but it also isn’t always necessary. When we expect professionals to assess more, are we equipping them to build effective assessment tools? Are we being thoughtful about targeting what we assess instead of assessing everything? Do we consider survey fatigue? We must consider these questions. Creating a few effective assessment tools that provide rich data, instead of conceding to the pressure to assess everything, will serve professionals well. Shifting the focus to quality over quantity is a change higher education must consider.
  • Dust-filled data. When we leave something in a corner and don’t attend to it, dust collects. The same happens with data in higher education. When we conduct assessment after assessment and do nothing with the results, our data gathers dust. Because most of our data is stored electronically we don’t see the dust, but it’s there. It’s not enough to say we did an assessment. We must go a step further and use the data! We must analyze the information we’ve collected, share it with folks who need to know, and adopt a plan for how the data will be used. When we do this, our assessment becomes purposeful. When we do this, our investment in that specific assessment is justified. When we do this, our colleagues and students are best served. What timeline can you set for yourself to keep dust from gathering on your data? What data currently needs dusting off?
  • Over our heads. Some higher education professionals have done a great job assessing in effective ways and using the data they collect. However, the way that data is disseminated often goes over our heads. The pressure professionals feel has turned into the need to create 30-page data analyses. What happened to one-page summaries? When will we use technology to disseminate our data? How can we make the analysis and presentation of data interesting, so people want to read and use it? These are all questions we should ask when considering the dissemination of data. I have found infographics to be a highly effective way to share information accessibly. Making changes to better share our stories is beneficial and necessary.

Assessment is a good thing. Making data-driven decisions is a good thing. We know this to be true. To ensure assessment doesn’t continue as too much of a good thing, professionals must consider the implications of the way we currently do assessment in higher education. The survey fatigue students experience, the pressure to have data for decisions of any size, and the expectation that we assess everything under the sun have clouded the goodness of assessment. How are you doing with quality over quantity? What data needs dusting off in your office? How can you make data accessible to all? Considering these questions will get you one step closer to keeping assessment good. Because remember, like my fourth cup of coffee in the afternoon, you want to steer clear of having too much of a good thing.

Mika Karikari is currently a doctoral student in Student Affairs in Higher Education at Miami University as well as an Associate in Career Services. Her professional background also includes academic support, residence life, and new student programs. You can follow Mika on Twitter @MikaKarikari or email her at johns263@miamioh.edu.

Sharing Data Effectively

One of the challenges we assessment-ites have is deciding what data to share and how to share it. When sharing data, you want it to be both interesting and appropriate to the intended audience. For data to have impact, it must be interesting. But not all data should be shared. Because I don’t have a better word for it, I’ll call that the “appropriateness” of the data. If the data is detrimental to your mission, it may not be appropriate to share.

It all starts with your intended audience. Is the audience your director? The dean? Students? Once you have the intended audience in mind, it’s helpful to visualize your data with the chart below:

[Chart: a two-by-two grid with “interesting” and “appropriate” as the axes]

I created this chart from the perspective of the students; if your audience is the math department, Dr. A’s calculus class becomes more appropriate. Similarly, if I’m the intended audience, “how many bagels am I eating each week?” becomes both interesting and appropriate. The idea is for all of your reported data to fall in the upper-right quadrant.
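
If it helps to make the quadrant concrete, here is a minimal sketch of the idea. This is my own illustration, not a tool from the post; the labels come from the examples above, and the scores and threshold are hypothetical judgment calls you would make for your own audience:

```python
# A hypothetical sketch of the interesting-vs-appropriate quadrant.
# Scores are invented for illustration; in practice they are judgment calls.
from dataclasses import dataclass

@dataclass
class DataPoint:
    label: str
    interesting: int   # 1 (dull) to 5 (compelling) for this audience
    appropriate: int   # 1 (shouldn't share) to 5 (clearly shareable)

def upper_right(points, threshold=4):
    """Keep only data that is both interesting AND appropriate to share."""
    return [p for p in points
            if p.interesting >= threshold and p.appropriate >= threshold]

# Scored from the students' perspective, as in the chart above.
candidates = [
    DataPoint("Grade distribution in Dr. A's calculus class", 5, 2),
    DataPoint("Courses students often take after this one", 5, 5),
    DataPoint("How many bagels the author eats each week", 1, 1),
]

for point in upper_right(candidates):
    print("Report:", point.label)
```

The thresholds are fuzzy, of course, but scoring data this way forces you to answer the two questions the chart asks before anything goes in a report.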

And now, more on the appropriateness of the data…

At our last advisor meeting in the fall term, we discussed a new tool available to our students. This tool, integrated into the registration system, gives students information about the courses in which they might enroll. The system tells them which courses past students often took after a given class and the degrees those students pursued. It even shows them the grade distribution of the class over the past few years. I had the requisite student affairs knee-jerk reaction: but do we want students to see grade data? Will they then avoid the “hard” classes and lean toward the “easy” ones? I put quotes around “hard” and “easy” because, you know, there are no easy or hard classes — every student’s experience is different.

After learning about the student interface, we were introduced to the staff interface. The staff view has MUCH more information. The system allows us to drill down and look at specific groups of students (sophomores, engineering students only, underrepresented groups, etc.). It’s a powerful tool, and I found myself lost in it for about 45 minutes that afternoon. It’s the Netflix of work; once opened, who knows how long you’ll be in there.

My thoughts bounced around like a ping-pong ball in a box of mousetraps. From “Students should not be able to see this! They’ll simply take the easier courses!” to “Students should have access to EVERYTHING! They need to learn to make data-driven decisions!” Then I started to settle down.

It’s good for us to share information with our students — especially information that interests them. They’ll make a ton of decisions in their lifetimes and need to navigate the information that’s out there. Sure, some of them will choose the classes with high average GPAs, but would they have been better served if we stuck to the usual “Nope, I can’t give you any guidance on this. You need to surf the thousand-course guide and find a class that interests you”?

But some data shouldn’t be widely available. If you’re trying to increase the number of women in your institution’s math major, it might be counterproductive to allow undeclared students to see “Oh, this major is 90% men. I don’t know if that’s a place I want to be.” It seems to me that kind of information sustains the imbalances we already have and are trying to mitigate.

To conclude…

It’s easy to get pulled into the “oh, can we add a question on ______ to the survey?” cycle. If you’re not careful, you end up with an oversized Excel spreadsheet and a bored audience. When you feel the survey creep happening, get back to the basic questions: Who is this for? Is this interesting to them? Is it appropriate for them?

Now go YouTube some other videos of ping-pong balls and mousetraps.