Campus Climate Survey on Sexual Assault

In Case You Missed It:

This week, the University of Michigan released the results of its 2015 campus climate survey regarding sexual assault.

Message from the President
Complete Survey Results

The University sent the survey to a sample of 3,000 students, and 67% of them responded. Among sampled students living on campus, 75% completed the survey.
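Those response rates can be turned into rough numbers. Here's a back-of-envelope sketch in Python (the 3,000-student sample and 67% rate come from the survey above; the worst-case margin-of-error calculation is purely illustrative):

```python
import math

# Figures from the survey announcement: 3,000 students sampled, 67% responded.
sample_size = 3000
response_rate = 0.67
respondents = round(sample_size * response_rate)

# Illustrative only: 95% margin of error for a reported proportion,
# using the worst case p = 0.5 among the ~2,010 respondents.
p = 0.5
moe = 1.96 * math.sqrt(p * (1 - p) / respondents)
print(respondents, round(moe, 3))  # → 2010 0.022
```

So any single percentage reported from the survey carries roughly a ±2 percentage-point sampling margin, before accounting for non-response bias.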


Visualization Mapping

I’m on an Edward Tufte kick – you’ll see this throughout the summer from me. I went to Edward Tufte’s one-day course a month ago (I HIGHLY recommend it for anyone interested in doing data visualizations). One newer thing Tufte showcased was an app he designed with Adam Schwartz called ImageQuilts (IQ). It’s an app that works with Chrome. After you install the app, you can run a Google Image Search, click the IQ button in your browser, and have it make a collage (“quilt”) of the images. Very cool. You can change each image’s size, pick what goes into the collage, use grayscale, etc. I’ve used it to quickly map ideas for how to best visualize data about student learning, how to best use Excel 2013, and to make a visual list of the best task apps (it’s a summer goal). Instead of a written list of ideas, I now have a visual list of ideas. Not to mention, IQ makes great quick images for presentations. IQ has been helpful in my quest to do better and more compelling visualizations with my data.
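ImageQuilts itself is a Chrome extension, but the core layout move – scale every image to a common row height, then pack images into a row until it fills up – can be sketched in plain Python. The function name and the image dimensions below are hypothetical, purely for illustration:

```python
def quilt_rows(sizes, row_height=100, max_width=400):
    """Pack images into quilt rows: scale each (width, height) pair
    to a common row height, starting a new row whenever the next
    image would overflow max_width. Returns rows of scaled widths."""
    rows, current, used = [], [], 0
    for w, h in sizes:
        scaled_w = round(w * row_height / h)
        if current and used + scaled_w > max_width:
            rows.append(current)
            current, used = [], 0
        current.append(scaled_w)
        used += scaled_w
    if current:
        rows.append(current)
    return rows

# Hypothetical (width, height) dimensions of five found images.
sizes = [(200, 100), (300, 150), (100, 100), (400, 100), (150, 300)]
print(quilt_rows(sizes))  # → [[200, 200], [100], [400], [50]]
```

The real extension also handles grayscale, reordering, and per-image resizing; this only shows the row-packing idea behind the quilt.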

Here’s *one I made of my summer travels (can you tell where I’m headed this summer???? I’ll give you a hint: there are 5 locations total):

[Image: quilt of Abby’s summer travels]

You can download ImageQuilts in the Chrome Web Store. It’s free! There are easy instructions on the ImageQuilt website (as well as some examples). So go ahead you assessment geeks and play around with IQ and see how it helps you. I’d love to see what you make!

*Quick disclaimer: I created none of these images. They were gathered by Google Image Search. I take no credit for any individual image.

Assessing Advisor Conversations

A few weeks ago I was eating in my favorite Ann Arbor pizza joint (New York Pizza Depot), when a friend of mine walked by the front window. Elaine peered in and decided to have some pizza herself. We started chatting about work and, despite trying to avoid the details, I got talking about assessment.

As I finished my elevator pitch of what we do for assessment, she said, “ugh, so you must all be evaluated based on the results of the assessment?” The feeling I had must have been how Jimi Hendrix felt when he realized that, as a lefty, he could just string a righty guitar upside down. Until that point I had shied away from connecting our assessment effort with individual advisors, under the idea that our assessment results could not accurately reflect the efforts of an individual advisor. But in that moment, I simply could not muster any justification for not breaking certain assessment results out by advisor.

Pausing for a moment, I don’t want to get into a discussion on the appropriate evaluation of advisors. My short response to that is: no, advisor performance should not be measured solely by a few surveys with which we’re ecstatic if we receive a 30% return rate. The evaluation ought to balance a handful of factors (and perhaps use more reliable data).

It hit me that we’re using our assessment to inform our delivery of information. Are students unsure what the Bulletin is? Let’s discuss it at orientation a bit more, or perhaps we’ll refer to it when we’re in our advising appointments or responding to e-mail. But there’s more to higher education than supplying information. Much of the value of advising comes out of our conversations with students. Since we each have our own style for advising, wouldn’t it be helpful to know if Steve’s students are getting involved at a higher rate, or if they’re more likely to find a position in the first few months after graduation? And isn’t that a great opportunity for us to have a conversation with Steve about how he approaches his student conversations, allowing us to reflect on our own practice?

My sense is that many of us, on some level, are afraid that our conversations with students are not always helpful and certainly not transformative. Helping a student sculpt a meaningful path for the near and far future is not at all like tutoring him for a math exam. It’s not uncommon for a student to leave my office with me thinking “Did I help that student at all?” Couple that insecurity with the fact that the outcomes we shoot for are often complicated and you’ve created an environment pushing assessment to the periphery.

If you’re reading this post (especially if you made it this far), you probably believe higher education is more than a set of information loaded into the minds of our students — that higher education can help individuals see themselves and the world in a more complex way. Central to this transformation are the conversations students have with faculty and staff. Though difficult to assess, it’s important that we improve the effectiveness of those conversations. While the assessment might not spit out a quantitative number for our conversation quality, it does give us a good opportunity to reflect and re-tool.

Data Viz

It’s FRIDAY! And it’s SUMMER! Which is a winning combination. Enjoy this excellent short video from PBS about data visualizations:

Summer Schedule – We hope you’re all enjoying your summer. I’m jet-setting left and right, and Mark is orienting new Wolverines – FUN! So for the summer season, we’ll be posting once a week. We’ll get back into the swing of more frequent postings in August. Have a great weekend! See you next week!

Assessment in the Sun: Summer To Do List

It’s summer, and while I’m excited to relax, I’m also looking forward to CRUSHING my summer office to-do list. There are many tangible items I need to complete (like hanging these 2 pictures that’ve been on my desk for 2 years – bah!), but with more consecutive hours of uninterrupted think time in this season, I’m focusing on those to-do’s that are less task-y and more nebulous. Most of the listed items center around the theme of “get caught up on thinking” – reflecting on our programs and learning goals and what is needed to make these more effective. So here I am, declaring to the world (well…er…the tiny portion of the world who reads this blog) my non-S.M.A.R.T. (yet smart?) assessment goals for the summer season. Maybe some of them will be similar to or complement assessment summer goals of your own (or not!):

  1. Check in with the office’s overall purpose and place in the institutional goals (read: are we headed in the right direction?)
    1. Review the College’s strategic plan (for the billionth time, but still, I see some part of it differently every time)
    2. Review the office mission statement (and fully memorize it…right now I know it like people know the song Sweet Caroline – I am confident and loud during the favorite parts and then mumble the rest)
    3. Review office 2014-2015 goals
  2. Channel the greats of compelling visual storytelling
    1. Edward Tufte
      1. “Loot” (as ET would say) through Beautiful Evidence and The Visual Display of Quantitative Information
    2. David McCandless
      1. Watch his TED Talk The beauty of data visualization 
    3. Maria Popova
      1. Brain Pickings
    4. The New York Times
  3. Learn a lot of needed stuff (I warned you that these wouldn’t be S.M.A.R.T.!)
    1. Check out my playlist for the watch list
  4. Figure out the impossible secret solution to how to effectively simultaneously manage (read: juggle like crazy) the dream and the details…the big picture and the little picture…the yearly and the daily…whatever you call it
    1. Commit to relationship with ONE task app (currently I’m courting about 4 – successfully but inefficiently)
    2. Which task apps do you all like? Trello? DropTask? Google Tasks?
  5. Hang those two damn pictures on my desk! Arggg!! (Ok, not an assessment goal, but important nonetheless…)

What’s on your summer assessment to-do list?

High Impact Practices: Resources

Hands-on learning, experiential education, engaged learning – whatever you may call it, student affairs professionals can agree that creating an environment in which students test, reflect upon, and reapply their learning will result in better outcomes (read: more bang for your higher education buck). We know this anecdotally, but the High-Impact Practices (HIP) research out there provides the data to support the impact HIPs have on the collegiate experience, as well as gives professionals ideas and steps for how to enact all of this goodness (or, more likely, maximize what you already have). What is clear in all of the research is that the next level of this engaged learning is not the mere existence of experiential education, but rather that students have multiple opportunities to engage in high-impact learning and that we properly assess these efforts and students’ level of learning.

Provided today at Oh no are resources for you to dive in deeper…

According to George Kuh, via NSSE, high-impact practices:

  • demand considerable time and effort,
  • facilitate learning outside of the classroom,
  • require meaningful interactions with faculty and students,
  • encourage collaboration with diverse others, and
  • provide frequent and substantive feedback

Below are the most widely held examples for HIPs from AAC&U:

[Table: AAC&U examples of high-impact practices]

On the NSSE website, you can build your own report with the data they’ve collected in 2013 and 2014 – so fun!! Give it a try and sift through it to review the fun findings. Have I mentioned FUN!

Ashley Finley (on behalf of the AAC&U) provides some brief (though important) thoughts on proper execution of HIPs:

Other Videos to Watch (or more likely, just listen to in the background while you work on something else and occasionally look at):

  • George Kuh presentation about HIPs:

  • Ashley Finley’s plenary presentation about integrative learning:

What high impact practices are you working within? Where have you found success?

Friday: Family Income and College Chances

It’s Friday!

I hope this starts a trend. Last week, the New York Times asked readers to draw a line predicting how likely a student is to attend college given their family’s income. FASCINATING.

My line was okay, but before I give away how I did, I want to give you a chance to try it out.

Go ahead, click here.

I love this because it challenges you to really lay out what you think is going on, and then it tells you how you did. No more of the easy “I don’t know what I think… why don’t you tell me what’s right” way of interacting with graphs.
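Under the hood, that “how you did” feedback is, in spirit, just a comparison between your drawn curve and the actual one. Here’s a minimal sketch of that kind of scoring, using invented numbers rather than the Times’ actual data:

```python
def score_guess(guess, actual):
    """Mean absolute error between a reader's drawn line and the
    actual curve, sampled at the same x positions (lower is better)."""
    assert len(guess) == len(actual)
    return sum(abs(g - a) for g, a in zip(guess, actual)) / len(guess)

# Hypothetical percent-attending-college values at five family
# income levels (NOT the Times' real data).
actual_curve = [30, 45, 60, 75, 90]
my_drawn_line = [25, 50, 60, 70, 95]
print(score_guess(my_drawn_line, actual_curve))  # → 4.0
```

A score of 0 would mean your drawn line matched the actual curve exactly at every sampled point.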


This segment is just filler so you don’t see the results by accident.

I suppose it’s a good place for a joke:
“I haven’t slept for 10 days… because that would be too long.” -Mitch Hedberg

I don’t even feel bad about that joke.

Here’s how I did:

Good news, I’m not the worst!

Though I’m amazed at how straight that line is.

How did you do?

In Case You Missed It: Title IX and Northwestern University

Back in February, Northwestern University professor Laura Kipnis wrote an article titled Sexual Paranoia Strikes Academe.

The article touches on a few sensitive topics — revolving around the impact that the current environment has on professors and their students. Sensitive enough that I’m going to let the article speak for itself.

Shortly thereafter, she received a notice from the university that she was under investigation for two Title IX complaints resulting from the article. She wrote about this experience in My Title IX Inquisition, which is currently available on the Chronicle of Higher Education’s website, though it looks like it may soon be locked as premium content.

If you’ve been following this at all, you’ve also seen that she has been cleared of these complaints.

Seeing this, I couldn’t help but remember my post from two weeks ago on the Kennesaw State advisor incident. As Bob Dylan would say, the times they are a-changin’. Ten years ago, the incident at KSU would not have made it to the internet. Without the video, that story loses much of its impact. Before 2011’s Dear Colleague letter, Kipnis’ article would not have prompted an investigation.

As someone who believes a core purpose of college is to challenge ideas, it’s important to me that professors feel comfortable voicing their thoughts and opinions — and if they choose, exploring them in a public forum.

It is also important to me that we as higher education professionals (and as a country) do better when it comes to sexual misconduct.

I think Northwestern dropped the ball on this one. Kipnis wrote an article that may offend some readers. Upon reading the article, a few students at that university felt attacked, and they (rightfully) submitted a complaint. As lack of reporting is a common issue with sexual misconduct, it’s a good thing that these individuals felt comfortable submitting the complaint. But then, according to Kipnis’ article, the university decided it was ill-equipped to investigate this case — essentially freeing themselves of the consequences of the decision.

Anything related to sexual misconduct seems to get a lot of press, and it is a topic that should be taken seriously. My concern here is that, as other sexual misconduct cases come out, universities will choose to hire outside investigators as a means of mitigating the risk of damaging their image. This effectively works around the whole purpose of the Title IX investigator (a role required of all institutions receiving federal aid — a stand-alone position at larger institutions), creating an under-utilized administrator at a time when administrative bloat is a concern among lawmakers, the people who get to decide how — and the extent to which — higher education is funded.

As colleges and universities are held responsible for more, I sense that we’re in for situations where institutional values are put in tension with one another. In this case, it’s the obligation to fulfill Title IX requirements against the urge to support academic freedom for professors.

Is Your Assessment Any Good?

This Friday, I had my annual performance review. When discussing my goals for the coming year, I mentioned that I’d like to change the way we measure my performance. Over the five years I’ve been with the Engineering Advising Center, I’ve pulled in a number of responsibilities. And that felt good. I felt important seeing an inbox with e-mails from across the college and university, all pertaining to the different hats I wear in my role.

But who cares if you can juggle six tennis balls when the guy next to you is juggling three chainsaws! What I mean is, I’m not doing anyone (my students or myself) any favors by trying to pull in as many responsibilities as possible. This year, I want to be measured by the quality of my two largest commitments — assessment and our Peer Advisor program. When I mentioned this to my supervisor, he asked me how we would know if our assessment is any good. What a good idea for a blog post.

So how do you know if your assessment is any good? My initial thoughts start to resemble the movie Inception.


I hope that image was from Inception and not one of the Batman movies.

How do you know if your assessment is any good? Here are a few thoughts:

  • Does it measure outcomes you care about?
  • Does it accurately measure those outcomes?
  • Do you ever share the results with others?
  • Does the data collected inform changes to office processes?

But that list feels a bit empty. You could chase after that second criterion forever — will any set of questions completely capture some of the more complicated learning outcomes? And are there other criteria that aren’t critical, yet are still of value? For example:

  • Do students care about the results?
  • Could the assessment be completed by someone else? (the ol’ “if you get hit by a bus” situation)
  • Do you ask high-quality, non-leading questions?
  • Can your assessment plan be explained in only a few minutes?

If you’re not careful, you can create a monster (as I did with the first assessment plan I designed). Recently, I’ve focused on assessment that captures the learning outcomes with as little excess as possible. In a quote often attributed to Einstein: “…everything should be made as simple as possible, but not one bit simpler.”

Thus far, I’ve been writing mostly about assessment at the office level. What if we take a step back and look at assessment at the college or institutional level? How powerful would assessment be if office-level assessment were just a part of a larger institutional effort? At the office level, we’re limited to fairly simple learning outcomes, mostly because we tend to have limited interactions with students. But as an institution, we have a far greater impact. Our students should be learning and growing in ways not captured by simply adding up the assessment efforts of individual offices. Shouldn’t we capture that impact? Shouldn’t this effort include more than simply employment data?

At the institutional level, this hints at the importance of assessment informed by institutional mission. Shouldn’t we try to capture the extent to which we’re meeting our mission?

I may have posed more questions in this post than answers. What attributes of good assessment have I missed?