A new paradigm has emerged in residential education. This paradigm, referred to generally as the “residential curriculum” approach, flips the script on how we view the role of residence halls in the life of the university. While the curriculum approach is newer, the concept of learning happening outside the classroom is not. The curriculum approach extends outside-the-classroom learning by using the residence hall setting as a laboratory for student learning.
The traditional approach to learning in the residence halls can be summarized as the programmatic approach, in which student staff (e.g., Resident Assistants, or RAs) lean on active programs that may be educational in nature but in many cases are purely social. Now hear me out: I am not against social programming. Social programming is necessary to build relationships among residents and to develop a community of depth.
The new curriculum approach takes its cues from our friends in academic affairs and starts with learning outcomes (i.e., what we want residents to learn). Once the learning outcomes are determined, the next step is to identify strategies that could help achieve those outcomes. After a strategy is employed comes the last, and most perplexing, process: assessing whether the learning outcome has been met, and thereby whether student learning has taken place.
Now in my eighth year working in residence education, I have been grateful to work with learning-outcomes-based models at three different institutions. At all three I have observed the development of learning outcomes and strategies, but I have found the last step, assessment, continuously perplexing to student affairs professionals in these contexts. We’ve mastered the assessment of student satisfaction, but have found it much harder to quantify, or qualify, student learning. Why is that?
There are several dilemmas, or barriers, I’ve encountered in attempting to gauge student learning. First, our student staff are often the ones carrying out our curriculum. In their role as peer educators, can they effectively assess the learning of their peers when this is something student affairs professionals have been trained to do? Second, time. It is hard enough to get students to attend an educational program, and adding an additional “last step” of an assessment survey can be asking a lot of students who have volunteered their own time to attend our program. What’s the incentive to complete the survey? For many students, there isn’t one. Third, many of our attempts at assessment fail and lean more toward anecdotes than valid evidence of learning. Wouldn’t it be great if, in a perfect world, we could have students stick around for a focus group and ask pointed questions that would help us elicit whether, in fact, learning took place?
Recently, I’ve been excited and grateful to join the Housing & Residence Life team at the University of Dayton in Dayton, Ohio. One of the decisions that played into my interest in applying for, and eventually accepting, my current position is the work my office is doing to promote student learning in our on-campus communities. Dayton is a highly residential campus (78% of students live on campus), and where students live is a central, even defining, part of the student experience.

In an effort to better leverage this affinity for living on campus, we set out to incentivize student learning in August 2014, a year before I started working there. We incentivize student learning by making each strategy of our residential curriculum (i.e., community meetings, one-on-ones, roommate agreements, programs) worth one point. The more points students accumulate, the greater their chances of receiving favorable housing for the next academic year. We no longer have a housing lottery. Instead, we track the points students earn by participating in opportunities for student learning and apply those points to our housing assignments process. This puts students in the driver’s seat of their housing assignment process.

The main dilemma this could pose is something I thought about immediately when interviewing for my current job: Are students really going to programs and “engaging in learning” because they want to, or because they just want the points? What I began to realize is that the student’s motivation for attendance isn’t really what we care about. What we care about is the experience and, hopefully, the learning that takes place through their attendance and participation in our learning opportunities. But how do we gauge whether that learning is actually happening? Hence the dilemma. What we may have on our side is the dangling carrot of points. Could we ask students to participate in a learning experience and then complete a short assessment of their learning in order to attain their points?
This may enable us to collect valid data that could help us demonstrate that students are learning in their residential environment, regardless of why they are there.
Matt Kwiatkowski is the Assistant Director of Residence Life at the University of Dayton in Dayton, Ohio. Please leave your comments below or feel free to contact Matt via email at email@example.com.