Best of 2015 Assessment Style

2015 was a big year for Oh No. In our first (almost) year of chatting assessment with you all, we’ve written 54 posts, reached over 2,000 of you, and learned a lot along the way (like: posting 3x/week is a bit much for everyone…). I love the end of the year, when all kinds of media go into review mode, highlighting the big moments of the year. So, in kind, voilà:

Oh No’s Most Popular Posts of 2015:

4. Stay the Course
3. Yik Yak: Assess? Eval? What Are We Doing?
2. Assess vs. Eval vs. Research

And our most popular post of 2015 :::drum roll:::

1. Incentivizing the Residential Curriculum

Thank you, Matt Kwiatkowski, for your post!

THANK YOU so much for your readership this year! We are SO thankful for your support and look forward to talking more assessment with you all in 2016!

Outside of the Oh No-sphere, check out my 2 favorite year-end reviews:

HAPPY NEW YEAR!

What Pitch Perfect 2 Says About College

I found myself watching Pitch Perfect 2 last night while ironing. I’m not embarrassed — those shirts wouldn’t de-wrinkle themselves.

Most of the time, I’m in a world where student affairs is an accepted norm. Several of my local friends are higher ed professionals, and many of my more distant friends are student affairs pros. After a while, I stopped noticing how my perspective on higher education might not line up with society’s view.

Oftentimes, what we see in movies or on TV has clearly been adjusted with television in mind. Remember Saved by the Bell: The College Years? Nobody lives in a space that large their first year in college. How about the entirety of National Lampoon’s Animal House? Sure, there are elements of a “typical” college experience in there, but for the most part, it’s a parody. Most viewers know these examples are not what college looks like today; that some adjustments were made to make the show more viewable, or more humorous.

Then I saw Pitch Perfect 2. Yes, it’s a comedy, and many scenes fall into the pattern of “we’re stretching reality quite a bit here, but doing it in the name of comedy!” But the movie dropped some more subtle hints about how we view higher education. Consider these examples.

The Welcome to College scene. Early in the movie, the main character attends a convocation-style welcome event in a large lecture hall, the type of event designed for new students that happens right as a semester is beginning. In this scene, one administrator is the host, bringing different student groups out to perform. The scene resembles a high school pep rally for a homecoming football game. Where were the common themes of you will be challenged! or expand your horizons!? Nowhere. The implication was that the students in the audience were in a new environment with a specific image of what’s expected of them — that they join an a cappella group. Success means conforming to the expectations of the college. The growth those students can expect has a very narrow definition.

The Student Affairs Discipline scene. In this scene, Anna Kendrick’s a cappella group meets with a Student Affairs dean and the announcers from an a cappella competition after a wardrobe malfunction at a nationally televised event. The role of the announcers is a bit unclear; let’s consider them representatives from the a cappella league. The Dean of Student Affairs in this scene acts as a strict disciplinarian. The group members walk into his office and stand there while he tells them of their punishment. The members have little opportunity to talk — their fate has already been decided.

These scenes reminded me that in many ways, institutions are viewed as gatekeepers of your future. If you get in, you’ll succeed. If you can keep up with the rigor, you’ve made it. Your success depends on whether you do what they ask of you. Of course, we view the experience as a mutual effort — the institutions provide opportunities to grow, and the students choose the opportunities in which to engage (and the extent to which they engage).

It’s movies like this that remind me why students are so focused on high exam scores, and the “right” set of extra-curriculars. Between the movies and the countless articles on top money-earning majors (etc.), college seems much more a place where you collect merit badges than a place of growth. I got this badge because I attended [insert prestigious institution] University, and this badge because I got that minor in ______. This one for the dean’s list. Oh, and I got this one as captain of the _____ club.

This view of the working world completely ignores the idea that students can craft their own future from their values and the talents they develop; instead, it assumes an environment where future employers hold all the cards and must be impressed if students have any hope of a job.

Somehow, I’ve meandered from the view that high school students have of college to career preparation. I suppose my point here is that there is a misalignment between how we (higher ed professionals) view the role/purpose of college and how the general public views college — and that the difference severely impedes a student’s ability to get the full value out of college.

Guest Blogger: Incentivizing the Residential Curriculum

A new paradigm has emerged in residential education in recent years. This paradigm, referred to generally as the “residential curriculum” approach, flips the script on how we view the role of residence halls in the life of the university. While the curriculum approach is newer, the concept of learning happening outside the classroom is not. The curriculum approach extends the concept of outside-the-classroom learning and uses the residence hall setting as a laboratory for student learning.

The traditional approach to learning in the residence halls can be summarized as the programmatic approach, where student staff (e.g., Resident Assistants or RAs) lean on active programs which could be educational in nature, but in many cases are purely social. Now hear me out—I am not against social programming. Social programming is necessary to build relationships among residents and develop a community of depth.

The new curriculum approach takes some cues from our friends in academic affairs and starts with learning outcomes (i.e., what we want residents to learn). Once the learning outcomes are determined, the next step is to identify strategies which could help you achieve those outcomes. After a strategy is implemented comes the last, and most perplexing, process: assessing whether the learning outcome has been met, and thereby whether student learning has taken place.

Now in my eighth year working in residence education, I’ve been grateful to work with learning-outcomes-based models at three different institutions. At all three I’ve observed the development of learning outcomes and strategies, but found the last step, assessment, continuously perplexing to student affairs professionals in these contexts. We’ve mastered assessing student satisfaction, but have found it much harder to quantify, or qualify, student learning. Why is that?

There are several dilemmas or barriers I’ve encountered in attempting to gauge student learning. First, many times our student staff are the ones carrying out our curriculum. In their role as peer educators, can they effectively assess the learning of their peers when this is something student affairs professionals have been trained to do? Second, time. It is hard enough to get students to attend an educational program, and adding an additional “last step” of an assessment survey can be asking a lot of students who have volunteered their own time to attend our program. What’s the incentive to complete the survey? For many students, there isn’t one. Third, many of our attempts at assessment fail, leaning more toward anecdotes than valid evidence of learning. Wouldn’t it be great if, in a perfect world, we could have students stick around for a focus group with pointed questions to help us elicit whether, in fact, learning took place?

Recently, I’ve been excited and grateful to join the Housing & Residence Life team at the University of Dayton in Dayton, Ohio. One of the decisions that played into my interest in applying for, and eventually accepting, my current position is the work my office is doing to promote student learning in our on-campus communities. Dayton is a highly residential campus (78%), and where students live is a central, even defining, part of the student experience.

In an effort to better leverage this affinity for living on campus, we set out to incentivize student learning in August 2014, a year before I started working there. We incentivize student learning by making each strategy of our residential curriculum (i.e., community meetings, one-on-ones, roommate agreements, programs) worth 1 point. The more points students accumulate, the greater their chances of receiving favorable housing for the next academic year. We no longer have a housing lottery. Instead, we track the points students earn by participating in opportunities for student learning and apply those points to our housing assignments process. This puts students in the driver’s seat of their housing assignment process.

The main dilemma this could pose is something I thought about immediately when interviewing for my current job: Are students really going to programs and “engaging in learning” because they want to, or because they just want the points? What I began to realize is that the student’s motivation for attendance isn’t really what we care about. What we care about is the experience and, hopefully, the learning that takes place through their attendance and participation in our learning opportunities. BUT how do we gauge if that learning is actually happening? Hence the dilemma.

What we may have on our side is the dangling carrot of attaining points. Could we ask students to participate in their learning experience AND then complete a short “assessment of their learning” in order to attain their points? This may enable us to collect some valid data which could help us demonstrate that students are learning in their residential environment, regardless of why they are there.

Matt Kwiatkowski is the Assistant Director of Residence Life at the University of Dayton in Dayton, Ohio. Please leave your comments below or feel free to contact Matt via email at mkwiatkowski1@udayton.edu. 

Assessment Pro Tips: NEEAN Conference

Last week I was at the New England Educational Assessment Network (NEEAN) fall forum at the College of the Holy Cross. Wow – what an excellent conference! The theme of the conference focused on the past, present, and future of assessment in higher education.

I co-presented with two incredible professional colleagues (see photo below): Carol Trosset (Associate Director of Institutional Research & Assessment, Carleton College) and Holly McCormack (Dean of Field Work Term, Bennington College) on assessing the liberal arts and how they prepare students for life after college via internships. Carol brought together the work she and Holly had been doing at Bennington with projects we’re in the midst of at Carleton to make this presentation. We had such a great audience who brought insightful questions and ideas. Loved it!

NEEAN collage

The keynote speaker, Steve Weisler, gave an excellent presentation and concurrent session about assessment’s present and future. I took furious notes; here’s what stuck out to me:

  • Assessment means riding the bike while building it
  1. Treat student learning outcomes (SLO) as an inquiry question – assessment is a process of inquiry NOT a committee report
  2. Assessment and SLOs need TIME to show their real value, similar to discipline-specific research
  3. Reconcile the fact that assessment needs lots of time with the fact that we need to be presenting/showing progress now
  • Focus on making sure we have the appropriate learning goals because they will shape the conversation
  1. SLOs need to have variables that are sensitive to what truly differentiates a student at the beginning and end of college (e.g., Is “critical thinking” the appropriate measure? Or is it focusing progress on the wrong metric?)
  • Content cannot be the main measure of learning
  1. Students will forget so much of the information-specific content they acquire, thus we need to focus more on capturing the larger learning happening around it
  • Don’t let perfect be the enemy of good: Assessment needs to start somewhere
  1. Be practical on your start, and then as you implement your assessment plan re-examine if your goals and strategies are in alignment
  2. You want quality SLOs over quantity – start small and simple and then grow into it
  3. You won’t be able to start if you’re constantly problematizing your process

A big THANK YOU to my co-presenters Carol and Holly for a meaningful collaboration and presentation, and to NEEAN and Steve Weisler for such a hearty, learning-dense conference.

Will You Be a Guest Blogger?

When we started Oh No, our hope was to have one LARGE conversation about assessment. Thus far, it’s mainly been us talking to ourselves – which is fun, but not achieving our goal.

We want to expand the conversation about assessment in higher education, and the best way to do that is to invite creative, innovative professionals to help take the conversation further. We have lots of smart professionals in our lives already who are doing amazing things in various areas of higher education (see some of them below!).

mac n joes

These friends of ours (and others who we don’t even know yet [i.e., hopefully YOU!]) will be adding their perspective in the coming weeks.

We’d love for you to add your voice and fill in the gaps that we are missing. If you’re interested in adding to the assessment conversation we’ve started, let us know by filling out the form below.

Sending you much assessment power, 

Abby and Mark
mark and abby

Assessment Conferences

Assessment heads – happy almost November! There is so much assessment in November that I’m looking forward to, most specifically:

  1. Higher Education Assessment Friendship Summit (i.e., Mark + Abby + our incredible group of friends = meeting of the minds in the same time zone!), and
  2. my group presentation at the New England Educational Assessment Network (NEEAN – try saying it 3 times fast) fall forum at the College of the Holy Cross.
Stay tuned next week for Oh No, Friendship Summit edition. For this week, NEEAN!

I wrote last time about great assessment collaborations, and the NEEAN presentation is one result of those. You’ve all read our many, many, many (did I mention many?) posts about learning goals/outcomes. My office ties several of our programs and services to nine student learning goals, and we’re gearing up to do that on an even broader scale. The Associate Director of Institutional Research and Assessment at Carleton (Carol Trosset) has been invaluable as we move into this next phase.

At NEEAN, we’ll be exploring my office’s learning goals through one example of this expansion: our internship program. Students create learning goals and strategies prior to their summer internship. They write reflections during and after the internship about their learning, in order to capture their outcomes. Carol has been helping us code students’ goals so that we may understand on a larger scale what students intend and seek to learn prior to their internship. At NEEAN, we’ll be comparing Carleton’s process to the great process at Bennington College, where Carol examined the outcomes of student experiential learning. I can’t wait to learn more about the Bennington process and get inspired by so many assessment professionals.

Next up on the home campus front, we’ll conduct focus groups with student interns about their learning outcomes. Over the next few weeks, I’ll have pre-interviews with student interns to start structuring the focus groups. So much great assessment happening – stay tuned for more assessment fun!

Assessment Takes a Village: The Power of Collaboration

Collaboration is an awesome thing. I’ve been working with various people and departments on campus on some exciting assessment projects over the last year. Good assessment takes a village; I can’t do it alone. It’s been a pleasure and a gift to benefit and learn from all the talent of my campus colleagues.

Here’s an overview of just a few of these projects:

  • Institutional Research: The associate director of IR and I have been working on all sorts of projects. One such project focuses on the learning goals our summer interns created prior to their internship. She analyzed these learning goals, coded the goals into overarching themes, and (eventually) will examine how those themes overlap with learning goals in the classroom (a project and expertise she’d initiated at a previous institution). 
  • ITS: Wow…ITS has helped our office with a number of incredible projects. Too many to list! They built a digital pipeline from our internal counseling note system (i.e., Symplicity) to the College’s data warehouse (read: so much data mining potential!!). And, they built us an interactive online career development tool for students – it’s an organizer, planner, and tracker all in one. Students can see our Career Center learning goals and the programs/services tied to each, select which they’d like to complete, pick a date they’d like to complete it by, and then check it off to track their progress. Did I mention, WOW?!?!
  • College Communications/Marketing: College communications took our survey data about student interns and created some really beautiful data visualizations. I don’t have an actual proof to show you yet, but the concept comes from *TIME Magazine. The original piece (see photo) focuses on income brackets, whereas ours focuses on interns by year and shows data about students and their internships (e.g., geographic location of internship, top internship industries, etc.) along the sides surrounding a photograph of the student. 
    Time Intern Posters
  • Mathematics/Statistics Department: This academic department offers a statistics elective called Statistical Consulting – the class organizes students into consulting groups, takes on actual organizations from the community as clients, and helps the organization address their real current issues using data. The Career Center was a client – the student group reviewed our data about student visits and helped us better understand which students we’re seeing, how often, and during which weeks in the year. Conversely, we also have a better understanding of who we’re not seeing. Valuable insights from this collaboration.

There are MANY other great departments and people collaborating with the Carleton Career Center; these are just a few from the last year. Assessing learning can be a blessed mess, so an ENORMOUS thank you to all the many people and offices who helped us achieve so many of our assessment goals. We couldn’t have done it without you!

With whom are you collaborating??

*Barone, E. (2014, September 8). Who we are. TIME, 184, 53-58.

Should we Assess Mental Health?

Of the students I meet with for academic difficulty, a startling proportion of them are in their present situation due to factors related to mental health. The shape of the issue varies. Sometimes it’s the stress of seeing everyone around them succeed — struggling students tend to be quiet about their performance. Other times it’s a depression that was mild and undiagnosed in high school, but starts to take hold in college. These students want to succeed and have the capacity to, but something about college life tosses a wrench in their ability to perform academically.

Because I work at a competitive Research 1 institution, almost all of my students excelled in high school. Anyone who’s meeting with me for academic difficulty is experiencing it for the first time, and rarely do they know how to cope.

These days, I’m hearing many in the education world discussing grit — the ability to overcome struggle and bounce back from failure. We’re even using the term in our new student orientation. At the same time that we’re telling our students “we want you to challenge yourself!” they know that their GPA is the first way they’re measured for their next phase of life (grad school, med school, employers, etc.). Sure, overcoming adversity sounds cool, but it sure doesn’t feel good while it’s happening, and for someone who’s always succeeded, that first “C” on an exam can feel like the first crack in the dam.

It’s probably not much of a jump to conclude that a student experiencing mental health difficulty is more likely to struggle academically. And isn’t academic success part of the role of a fair number of student affairs offices? If we could identify the students struggling with mental health, we could go a long way toward supporting them through their journey.

The challenge here is that most student affairs practitioners (myself included) are not experts at diagnosing mental health issues. Balance that with the fact that we (i.e., advising, housing, etc.) are often the first to notice a student is struggling. Aren’t we also the folks charged with supporting the successful transition of our students into the college environment?

I’m not quite comfortable claiming that we can be responsible in any way for the mental health of our students, or that fewer students with mental health challenges means that we’re succeeding, but I also believe that our response and support of these students should be a part of how our success is measured.

Data Storytelling with Fantasy Football

Before 2010, I didn’t care AT ALL about sports. But, being the extrovert (and PROUD part-time bandwagoner) that I am, I decided to get into football because that’s what people were talking about. So this Iowa girl started following the New Orleans Saints…a natural choice (former French teacher over here, remember? NOLA was the best I could do!).

In a similar vein, for the past 5 years, Mark, some of our friends from MiamiU, and I have had a fantasy football league together.**

My team = the Tenacious Trouts

Mark’s (I’m using air quotes here) “clever” team = Co-constructing PAIN (Very student development theory of him…nice, Mark!)

Anyhoo…we use Yahoo Fantasy Football (YFF) and one feature that I’ve enjoyed this year is the Game Recap. Yahoo blends together highlights from the “game”, images, and data to tell the story of (in this rare case) my amazing upset against another team, Handy Mart (no air quotes for that team – she’s won multiple years in a row!).

summary ff

This game data recap makes reading about my fake team’s fake game much more dramatic and interesting than just the bunch of computer algorithms that it is.

sections of ff

It weaves the story of the game data together so accessibly that it makes even the more nuanced highlights and plays from the game exciting for a sports novice like me. And, in thinking about collecting data and assessing learning, really, isn’t that one of the main goals? Lots of offices collect data – and while that’s by no means easy, I think the deeper challenge is what you do with that data. And how do you tell the story of your data (i.e., what students learned and were able to do as a result of your efforts) to make it accessible to important stakeholders?

Data storytelling means I need to do more than show what % of students responded “agree” or “disagree” on a survey. I need to use the data to narrate what all those survey responses mean and the overarching story that arose. Practically, here are a few simple strategies Yahoo Fantasy Football uses that can apply to us. When you have a bunch of data:

  1. Cluster information into categories – not only will categories make your data much more digestible to your audience, but the groups in and of themselves will make telling the story easier for you and the audience.
  2. Use interpretive titles – show your data but also give it a title that helps the audience understand what they’re seeing/reading and what it means (the way a headline to an article quickly and succinctly communicates the main point).
  3. Blend images, text, and data together – there’s no need to exile all the graphs to one page and text to another. Instead, put them side-by-side so they can complement and strengthen each other.
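As a quick sketch of strategies 1 and 2, here’s what clustering survey items into categories with interpretive titles might look like in Python (the survey items, categories, and agreement rates below are all made up for illustration):

```python
from statistics import mean

# Hypothetical survey results: item -> share of students answering "agree"
agree_rates = {
    "I can articulate my skills": 0.81,
    "I can describe my career values": 0.74,
    "I know how to network": 0.62,
    "I can conduct an informational interview": 0.58,
}

# Strategy 1: cluster items into categories;
# Strategy 2: give each category an interpretive title
categories = {
    "Students know themselves": [
        "I can articulate my skills",
        "I can describe my career values",
    ],
    "Students can build connections": [
        "I know how to network",
        "I can conduct an informational interview",
    ],
}

for title, items in categories.items():
    avg = mean(agree_rates[item] for item in items)
    print(f"{title}: {avg:.0%} agree")
```

Two titled lines tell a clearer story than four raw survey percentages ever could.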

Happy storytelling!

**I'd like to note that I have been the league champion ONE time! Again, a rare occurrence which probably was due to my opponents getting too busy to change their rosters, but I'll take it!

Statistics! Part 2

So you’ve designed a new workshop, which you’ve guaranteed will bring up your students’ grades by a full letter! You spent weeks preparing the workshop, you gave the workshop, and now the grades are coming in. Did your students improve by a letter grade? That’s an easy calculation using descriptive statistics. Simply average last term’s GPAs among the students in your group and compare that to this term’s average.
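That before-and-after comparison really is this simple. A minimal sketch in Python (the GPA numbers here are invented for illustration):

```python
from statistics import mean

# Hypothetical GPAs for the same five workshop attendees, two terms
last_term = [2.4, 2.8, 3.0, 2.1, 2.6]
this_term = [3.1, 3.3, 3.2, 2.9, 3.0]

# Descriptive statistics: compare the two averages
improvement = mean(this_term) - mean(last_term)
print(f"Average GPA change: {improvement:+.2f}")
```

A positive number says the group’s average went up; whether the workshop caused it is the next question.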

But did your workshop really make a difference? Let’s say you want to know if your workshop can really be said to bring up student grades (or if you just got lucky). This is where inferential statistics come in! Remembering Abby’s post from last week, the sample in this case is the students in our workshop and the population is all of the students at the university.*

T-tests can tell you if a given experience (e.g., a workshop) impacts the mean score (e.g., grades) for a given group of students. When you hear folks describing “pre” and “post” tests, this is likely a scenario where t-tests are helpful.
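As a rough sketch (with invented numbers again), a paired t-test on pre/post GPAs can be computed by hand with nothing beyond Python’s standard library; in practice you’d use a stats package and check the resulting p-value:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t-statistic: mean per-student difference over its standard error."""
    diffs = [after - before for before, after in zip(pre, post)]
    se = stdev(diffs) / math.sqrt(len(diffs))  # standard error of the mean difference
    return mean(diffs) / se

# Hypothetical GPAs before and after the workshop (same students, same order)
pre = [2.4, 2.8, 3.0, 2.1, 2.6]
post = [3.1, 3.3, 3.2, 2.9, 3.0]

t = paired_t(pre, post)
print(f"t = {t:.2f}")  # compare against a t table with len(pre) - 1 degrees of freedom
```

The larger the t-statistic, the less likely the improvement happened by chance alone.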

Regression can tell you if (and the extent to which) two variables are connected. For example, if you want to know whether a student’s grade in calculus can predict their grade in physics. A regression analysis will tell you if there’s a relationship and how strong that relationship is. This test is appropriate when both variables are quantitative.
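Here’s a minimal least-squares sketch of that calculus-predicts-physics example (all grade points invented for illustration); a real analysis would also check significance:

```python
from statistics import mean

def linreg(x, y):
    """Ordinary least squares with one predictor: returns (slope, intercept, r)."""
    mx, my = mean(x), mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5  # correlation coefficient: strength of the relationship
    return slope, intercept, r

# Hypothetical grade points: calculus (predictor) and physics (outcome)
calc = [2.0, 2.5, 3.0, 3.5, 4.0]
physics = [2.2, 2.4, 3.1, 3.3, 3.9]

slope, intercept, r = linreg(calc, physics)
print(f"physics ~ {slope:.2f} * calculus + {intercept:.2f} (r = {r:.2f})")
```

An r near 1 (or -1) means a strong relationship; an r near 0 means calculus grades tell you little about physics grades.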

Writing this post, it occurs to me that 1) trying to explain this stuff gets complicated FAST, and 2) I’ve lost most of the details I learned in my statistics courses.

The main idea here is that we have mathematical tools which can calculate for us how likely it is that a given experience or situation can predict another experience or situation. Want to know if career counseling helps students find a career? Statistics can answer that.

The downside here is that not every statistic-based conclusion can be trusted. Much like Harry Potter’s wand,** this only works when a person knows (or at least sort-of knows) what they’re doing. I’ve noticed that it gets cold a few weeks after students arrive on campus — it’s the STUDENTS who cause winter!!!!

Correlation

In most cases, assessment doesn’t require statistics (beyond mean, median, etc.). As intelligent people with a limited amount of time on our hands, it’s okay to look at some numbers, make conclusions, and update our office processes. That said, if you happen to have someone on your staff with the time and the background, you’re in luck — you can start making conclusions about the effectiveness of your department practices. This allows you to identify the practices making a difference. In a time when resources are tight, the ability to carefully prune our student affairs bonsai trees (you’re welcome for that metaphor) will become more and more important.

*This assumes your workshop was attended by a random group of students from across the university. If, for example, the workshop was only advertised to engineering students, then your population would be engineering students. In short (and probably oversimplified), your population is the group from which the sample comes.

**This is just an assumption. I haven’t read any Harry Potter, but I assume he doesn’t want other people messing around with his wand.