Assessment Pro Tips: NEEAN Conference

Last week I was at the Northeast Educational Assessment Network (NEEAN) fall forum at the College of the Holy Cross. Wow – what an excellent conference! The conference’s theme was the past, present, and future of assessment in higher education.

I co-presented with two incredible professional colleagues (see photo below): Carol Trosset (Associate Director of Institutional Research & Assessment, Carleton College) and Holly McCormack (Dean of Field Work Term, Bennington College) on assessing the liberal arts and how they prepare students for life after college via internships. Carol brought together the work she and Holly had been doing at Bennington with projects we’re in the midst of at Carleton to make this presentation. We had such a great audience who brought insightful questions and ideas. Loved it!

NEEAN collage

The keynote speaker, Steve Weisler, gave an excellent presentation and concurrent session about assessment’s present and future. I took furious notes; here’s what stuck out to me:

  • Assessment means riding the bike while building it
  1. Treat student learning outcomes (SLOs) as an inquiry question – assessment is a process of inquiry NOT a committee report
  2. Assessment and SLOs need TIME to show their real value, similar to discipline-specific research
  3. Reconcile the fact that assessment needs lots of time with the fact that we need to be presenting/showing progress now
  • Focus on making sure we have the appropriate learning goals because they will shape the conversation
  1. SLOs need to have variables that are sensitive to what truly differentiates a student at the beginning and end of college (e.g., Is “critical thinking” the appropriate measure? Or is it focusing progress on the wrong metric?)
  • Content cannot be the main measure of learning
  1. Students will forget so much of the information-specific content they acquire, thus we need to focus more on capturing the larger learning happening in its midst
  • Don’t let perfect be the enemy of good: Assessment needs to start somewhere
  1. Be practical on your start, and then as you implement your assessment plan re-examine if your goals and strategies are in alignment
  2. You want quality SLOs over quantity – start small and simple and then grow into it
  3. You won’t be able to start if you’re constantly problematizing your process

A big THANK YOU to my co-presenters Carol and Holly for a meaningful collaboration and presentation, and to NEEAN and Steve Weisler for such a hearty, learning-dense conference.

Podcast Recommendation: Show About Race

I love podcasts. My current favorite is Our National Conversation about Conversations about Race (a.k.a. “Show About Race”) with co-discussants Raquel Cepeda, Baratunde Thurston, and Tanner Colby. They describe their podcast as:

Authors Baratunde Thurston (How To Be Black), Raquel Cepeda (Bird Of Paradise: How I Became Latina) and Tanner Colby (Some Of My Best Friends Are Black) host a lively multiracial, interracial conversation about the ways we can’t talk, don’t talk, would rather not talk, but intermittently, fitfully, embarrassingly do talk about culture, identity, politics, power, and privilege in our pre-post-yet-still-very-racial America. This show is “About Race.”

WHAT AN EXCELLENT PODCAST!

I so enjoy and appreciate this show – this is important stuff (holy understatement, Batman) and their conversations inform and challenge me about power, privilege, race, etc. in the way that all people (but especially me, as a white person) should be informed and challenged. This trio’s thoughtful and frank conversations keep these topics/issues/people’s lived experiences at the forefront of my thinking about collecting data in higher education, assessing learning, and meeting the needs of all students.

Show About Race also posts a response episode during the off-weeks called the B-side, on which they read listener feedback about the previous show as well as reflect on their conversation and clarify/expand on their comments. I like the B-side as much as the regular show because it feels like a rare opportunity to have a discussion, get feedback and time to reflect on it, and then come back and discuss it again (and in a public forum!). It also happens to cater to my enjoyment of talking…about talking (can you tell I’m an extrovert???).

Get to iTunes (or your favorite podcast app) and subscribe to Show About Race. My favorite episodes so far have been #009, about white fragility, and #002, about colorism (among many other things). Cannot wait to hear more!

For the Love of Counting: The Response Rate Rat Race

I’m in the midst of our annual summer experiences survey – my office’s push to understand: what do students do over the summer? And is it meaningful? We know that getting ALL students to respond to our 3-12* question survey would be near impossible, but, as the assessment person on the team, it’s my job to always chase that dream (let’s be real, it’s my obsession to chase that dream!). And at a small institution like mine, a 100% response rate (~1,500 students) seems like an attainable goal. But this raises so many questions for me.

A little bit of context about the survey. Students do many valuable things over the summer that add meaning to their college experience; the particular subsection of this data that chiefly interests me (as I rep the Career Center) is the number of students who intern.

Common statistical wisdom would tell me that if I am indeed going to report on how many students intern over the summer, then I need a certain response rate in order to make an accurate, broader statement about what percentage of Carleton students intern. This stats wisdom is based on a few factors: my population size, my sample size, and the margin of error with which I’m comfortable (I know, I know…ugh, statistics terms. Or maybe some of you are saying YAY! Statistics terms! Don’t let me stereotype you):

Population size = 1,500 (all the upperclass students)

Sample size = 1275 (well…this is the goal…which is an 85% response rate…but do I need this # to be accurate and broad?? Hmm…better look at what margin of error I’m comfortable with…)

Margin of error = um…no error??? Baaaahhhh statistics! Sorry readers, I’m not a stats maven. But that’s ok, because SurveyMonkey greatly helped me determine this:

Margin of Error
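
If you’re curious what’s happening under the hood of that calculator, here’s a rough sketch of the textbook arithmetic: a normal-approximation margin of error for a proportion, with a finite population correction and the most conservative assumption of p = 0.5. I don’t know SurveyMonkey’s exact formula, so treat the function below (and its name) as my own ballpark reconstruction, not their implementation.

```python
import math

def margin_of_error(n, N, z=2.576, p=0.5):
    """Approximate margin of error for a proportion.

    n: number of respondents, N: population size,
    z: 2.576 is the critical value for 99% confidence,
    p: 0.5 is the most conservative (worst-case) proportion.
    """
    standard_error = math.sqrt(p * (1 - p) / n)
    # Finite population correction: the error shrinks when you've
    # surveyed most of a small population.
    fpc = math.sqrt((N - n) / (N - 1))
    return z * standard_error * fpc

# My survey: 1,500 upperclass students, goal of 1,275 respondents (85%)
print(f"{margin_of_error(1275, 1500):.1%}")  # roughly 1.4%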

Ok, so if I want to be SUPER confident (99%), then my goal of 1,275 students (or an 85% response rate) will get me a VERY small margin of error (read: this is good). But, it turns out, if I look at this from the angle of sample size, I could have nearly the same small margin of error if I only had 1,103 students respond (a 74% response rate).

Sample Size
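
Flipping the question around – how many respondents do I need for a margin of error I can live with? – looks roughly like the sketch below. Same caveats as before: this is the standard textbook formula with a finite population adjustment, not necessarily SurveyMonkey’s exact math, which is probably why it lands at about 1,102 instead of their 1,103 (rounding of the z-value, most likely).

```python
import math

def required_sample_size(N, moe, z=2.576, p=0.5):
    """Respondents needed to hit a target margin of error (moe, as a
    proportion) at a given confidence level, adjusted for a finite
    population of size N."""
    n0 = (z ** 2) * p * (1 - p) / (moe ** 2)   # sample size for an "infinite" population
    return math.ceil(n0 / (1 + (n0 - 1) / N))  # shrink it for a population of only N

# 1,500 students, 2% margin of error, 99% confidence
needed = required_sample_size(1500, 0.02)
print(needed, f"respondents, a {needed / 1500:.0%} response rate")  # ~1,102, about 73-74%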

So, at this point, I could ask: Why the heck am I busting my butt to get those extra 11% of respondents??? YARG! And statistically, that would be a valid question.

But I don’t ask that question. I know I chase the 85% and 100% response rate dream because I aim to serve ALL students. Even if, statistically, all the students after the first 1,103 respond consistently, there is likely an outlier…one or a few student stories that tell me something the first 1,103 couldn’t – something that shapes a better student experience for all.

So to all of you, regardless of whether you have a relatively small population size (like me) or a much larger one (hint, Mark, Michigan Engineering, hint), I say keep up the good work trying to reach and understand the stories of 100% of your students. It may be an impossible dream, but that doesn’t make it any less worthy a pursuit.

*3-12 question survey based on what the student did over the summer - skip logic, woot woot!

Assessment on the Go!

Summer is busy! Between attending conferences, catching up on planning items, and (hopefully) a little R&R, people are here, there, and everywhere. What’s a data head to do?

As you know, my summer goals include all things data visualization. In my search for learning and inspiration, I stumbled across the podcast Data Stories. With my 32-minute commute to work every day, listening to hosts Moritz and Enrico on Data Stories is perfect. I first got hooked on an episode with Miriah Meyer about exploratory data viz tools. Imagine talking about data visualizations, data tools, and methods with your friends – that’s Data Stories. Was Miriah’s research waaaaay beyond me?? Yes. BUT it was so enjoyable to listen to their banter AND it gave me a lot of ideas, so I was drawn back to DS for more.

Here’s why I like it:

  • Data + friendship = my favorite
  • Cataloged points in each episode, so you can pick and choose the parts of an individual episode you want to listen to – just visit their website. Some of the episode topics are over my head, so this is a useful tool!
  • Learn, get inspired, and brainstorm ideas all while on your way to work
  • Resources galore – they always provide related and discussed links from each episode on their website – very handy!
  • You can subscribe (read: automatic updates! no thinking involved!)

Go ahead, peruse the Data Stories archives, get onto your favorite podcast app, and binge! Next in my queue are episodes about data art and data journalism. The perfect complement to a summer on the go.

See you next week!

High Impact Practices: Resources

Hands-on learning, experiential education, engaged learning – whatever you may call it, student affairs professionals can agree that creating an environment in which students test, reflect upon, and reapply their learning will result in better outcomes (read: more bang for your higher education buck). We know this anecdotally, but the High Impact Practices (HIP) research out there provides the data to back up the impact these practices have on the collegiate experience, as well as gives professionals ideas and steps for how to enact all of this goodness (or, more likely, maximize what you already have). What is clear in all of the research is that the next level of this engaged learning is not the mere existence of experiential education, but rather ensuring that students have multiple opportunities to engage in high impact learning and that we properly assess these efforts and students’ level of learning.

Provided today at Oh no are resources for you to dive deeper…

According to George Kuh via NSSE, high impact practices:

  • demand considerable time and effort,
  • facilitate learning outside of the classroom,
  • require meaningful interactions with faculty and students,
  • encourage collaboration with diverse others, and
  • provide frequent and substantive feedback.

Below are the most widely held examples for HIPs from AAC&U:

AAC&U high impact practices table

On the NSSE website, you can build your own report with the data they’ve collected in 2013 and 2014 – so fun!! Give it a try and sift through it to review the fun findings. Have I mentioned FUN!

Ashley Finley (on behalf of the AAC&U) provides some brief (though important) thoughts on proper execution of HIPs in her video.

Other Videos to Watch (or more likely, just listen to in the background while you work on something else and occasionally look at):

  • George Kuh’s presentation about HIPs

  • Ashley Finley’s plenary presentation about integrative learning

What high impact practices are you working within? Where have you found success?