Assessment Summer Retreat

Hello assessment friends! The summer has been fun, but flying by. It kicked off with Mark’s bachelor party (in Niagara Falls), and then the wedding – there were beautiful vows, dancing, and pie; I was a groomsmaid; a rooster crowed loudly throughout the ceremony; and it was a happy day.


Congratulations Mark & Kate!

Then, I went to Disneyland and it was truly magical.


And now it’s time to get back to work! 

What I hear from people all the time is that they’ve done so much good and diligent work throughout the year collecting program attendance numbers, student feedback, and surveys, but with no time (and maybe some analysis paralysis) to dig into what all that delicious data says about what students learned.

If you’re like me, you work amongst thoughtful, passionate, student-centered professionals. And summer in the office is THE time to reflect, work on projects, and retreat. So, bring all of these elements together – your student usage/engagement data and feedback, great colleagues, and summer reflection time – to tackle the question: how did all of our initiatives and efforts with students impact their learning?

Next week my colleagues and I will be retreating (well…not actually away from campus, but you get what I mean) to do just this: review our office goals and student engagement data, figure out what it all means, and strategize and vision for the coming year. Specifically, we’ll be focusing on:

  • What patterns do you see in our student usage data? (see the rough sketch after this list for one way to start slicing it)
  • What do the data and patterns say to you about how students engaged with our initiatives and services?
  • How did all of our initiatives and efforts with students impact their learning?
  • How does our student engagement data relate to our annual office goals?
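
If you want a head start before your own retreat, here’s a minimal sketch (in Python with pandas) of one way to start slicing usage data for patterns – the file name and the columns (visit_date, class_year, program) are just placeholders for whatever your own tracking system exports:

```python
import pandas as pd

# Hypothetical export from a tracking system; the file name and
# column names (visit_date, class_year, program) are placeholders.
visits = pd.read_csv("student_usage.csv", parse_dates=["visit_date"])

# Pattern 1: how does usage rise and fall across the year?
visits_per_week = visits.set_index("visit_date").resample("W").size()

# Pattern 2: which programs/services are students using, by class year?
usage_by_program = (
    visits.groupby(["class_year", "program"]).size().unstack(fill_value=0)
)

print(visits_per_week)
print(usage_by_program)
```

Even two simple tables like these give a retreat group something concrete to react to when asking what the patterns mean.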

Don’t be afraid to not have all the answers, and don’t be afraid to struggle as a group through these types of questions. This will help you make data- and assessment-driven decisions that ultimately help students (and, hopefully, it will be a little fun too!).

Happy retreating!


I collected data! Now what?!

We’re coming to the close of yet another academic year and you did it! You surveyed students, or tracked who did (and didn’t!) visit your office, or assessed the student learning outcomes of a program, or whatever else we keep preaching about on this blog. But now what? If you read any assessment book, at this point there are common next steps that include things like “post-test” and “close the loop” and a bunch of other common (and good!) assessment wisdom. But sometimes that common assessment wisdom isn’t actually helping any of us professionals DO something with all this data. Here are a few things I do with my data before I do something with my data:

  1. Share the data in a staff meeting: Your colleagues may or may not be formally involved in the specific program you assessed but they work with the same students, so they’ll be able to make connections within student learning and to other programs/services that you’re missing. Ask them about the themes they’re seeing (or not seeing!) within the data. It’ll help you clarify the outcomes of your data, bring more people into the assessment efforts in your office (more heads are better than one!), and it’s a nice professional development exercise for the whole team. Teamwork makes the dream work!
  2. Talk to peer colleagues about their version of the same data: Take your data* to a conference, set up a phone date with a colleague at a peer school, or read other schools’ websites. Yes, you’ll likely run into several situations that aren’t directly applicable to yours, but listen for the bits that can inspire action within your own context.
  3. Take your data to the campus experts: Know anyone in Institutional Research? Or the head of a curriculum committee? Or others in these types of roles? These types of people work with the assessment process quite a bit. Perhaps take them to coffee, make a new friend, and get their take.
  4. Show your data* to student staff in your office: Your student staff understand the inner workings of your office AND the student experience, so they’re a perfect cross section of the perspective that will breathe life into the patterns in your data. What do they see? What data patterns would their peers find interesting? What does it mean to them?

WOW, can you tell I’m an extrovert?! All of my steps include talking. Hopefully these ideas will help you to not only see the stories of student learning and programmatic impact in your data, but also to make the connections needed to progress toward closing the loop.

* This goes without saying, but a reminder is always good: make sure to anonymize the data you show students and those outside of your office/school!
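
For example, here’s a minimal sketch (in Python with pandas) of what that scrubbing step might look like before data leaves your office – the file names, column names, and the group-size cutoff are all placeholders, not a prescription:

```python
import pandas as pd

# Hypothetical raw export; the file and column names are placeholders.
raw = pd.read_csv("program_feedback.csv")

# Drop direct identifiers before the data leaves your office.
shareable = raw.drop(columns=["student_id", "name", "email"], errors="ignore")

# Optional: suppress very small groups so no individual can be picked out.
group_sizes = shareable.groupby("class_year")["class_year"].transform("size")
shareable = shareable[group_sizes >= 5]

shareable.to_csv("program_feedback_shareable.csv", index=False)
```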

Assessment Pro Tips: NEEAN Conference

Last week I was at the New England Educational Assessment Network (NEEAN) fall forum at the College of the Holy Cross. Wow – what an excellent conference! The conference theme was the past, present, and future of assessment in higher education.

I co-presented with two incredible professional colleagues (see photo below): Carol Trosset (Associate Director of Institutional Research & Assessment, Carleton College) and Holly McCormack (Dean of Field Work Term, Bennington College) on assessing the liberal arts and how they prepare students for life after college via internships. Carol brought together the work she and Holly had been doing at Bennington with projects we’re in the midst of at Carleton to build this presentation. We had such a great audience who brought insightful questions and ideas. Loved it!

[Photo: NEEAN presentation collage]

The keynote speaker, Steve Weisler, gave an excellent presentation and concurrent session about assessment’s present and future. I took furious notes; here’s what stuck out to me:

  • Assessment means riding the bike while building it
    1. Treat student learning outcomes (SLO) as an inquiry question – assessment is a process of inquiry NOT a committee report
    2. Assessment and SLOs need TIME to show their real value, similar to discipline-specific research
    3. Reconcile the fact that assessment needs lots of time with the fact that we need to be presenting/showing progress now
  • Focus on making sure we have the appropriate learning goals because they will shape the conversation
    1. SLOs need to have variables that are sensitive to what truly differentiates a student at the beginning and end of college (e.g., Is “critical thinking” the appropriate measure? Or is it focusing progress on the wrong metric?)
  • Content cannot be the main measure of learning
    1. Students will forget so much of the information-specific content they acquire, thus we need to focus more on capturing the larger learning happening in its midst
  • Don’t let perfect be the enemy of good: Assessment needs to start somewhere
    1. Be practical about where you start, and then, as you implement your assessment plan, re-examine whether your goals and strategies are in alignment
    2. You want quality SLOs over quantity – start small and simple and then grow into it
    3. You won’t be able to start if you’re constantly problematizing your process

A big THANK YOU to my co-presenters Carol and Holly for a meaningful collaboration and presentation, and to NEEAN and Steve Weisler for such a hearty, learning-dense conference.

Will You Be a Guest Blogger?

When we started Oh No, our hope was to have one LARGE conversation about assessment. Thus far, it’s mainly been us talking to ourselves – which is fun, but not achieving our goal.

We want to expand the conversation about assessment in higher education, and the best way to do that is to invite creative, innovative professionals to help take the conversation further. We have lots of smart professionals in our lives already who are doing amazing things in various areas of higher education (see some of them below!).

[Photo: mac n joes]

These friends of ours (and others who we don’t even know yet [i.e., hopefully YOU!]) will be adding their perspective in the coming weeks.

We’d love for you to add your voice and fill in the gaps that we are missing. If you’re interested in adding to the assessment conversation we’ve started, let us know by filling out the form below.

Sending you much assessment power, 

Abby and Mark

Assessment Conferences

Assessment heads – happy almost November! There is so much assessment in November that I’m looking forward to, most specifically:

  1. Higher Education Assessment Friendship Summit (i.e., Mark + Abby + our incredible group of friends = meeting of the minds in the same time zone!), and
  2. my group presentation at the New England Educational Assessment Network (NEEAN – try saying it 3 times fast) fall forum at the College of the Holy Cross.
Stay tuned next week for Oh No, Friendship Summit edition. For this week, NEEAN!

I wrote last time about great assessment collaborations, and the NEEAN presentation is one result of those. You’ve all read our many, many, many (did I mention many?) posts about learning goals/outcomes. My office ties several of our programs and services to nine student learning goals, and we’re gearing up to do that on an even broader scale. The Associate Director of Institutional Research and Assessment at Carleton (Carol Trosset) has been invaluable as we move into this next phase.

At NEEAN, we’ll be exploring my office’s learning goals through one example of this expansion: our internship program. Students create learning goals and strategies prior to their summer internship. They write reflections during and after the internship about their learning, in order to capture their outcomes. Carol has been helping us code students’ goals so that we can understand, on a larger scale, what students intend and seek to learn before their internships (a rough sketch of what that coding makes possible is below). At NEEAN, we’ll be comparing Carleton’s process to the great process at Bennington College, where Carol examined the outcomes of student experiential learning. I can’t wait to learn more about the Bennington process and get inspired by so many of my fellow assessment professionals.
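
To make that concrete, here’s a minimal, made-up sketch (in Python with pandas) of the kind of summary that coded goals make possible – the goals, themes, and counts below are invented placeholders, not our actual data or Carol’s coding scheme:

```python
import pandas as pd

# Hypothetical rows: one learning goal per row, plus the theme code a
# human reviewer assigned. All goals and themes here are made up.
goals = pd.DataFrame(
    {
        "student": ["A", "A", "B", "C", "C"],
        "goal_text": [
            "Improve my professional writing",
            "Learn how a nonprofit is funded",
            "Build a network in publishing",
            "Get comfortable presenting to clients",
            "Practice writing for a general audience",
        ],
        "theme": [
            "communication",
            "industry knowledge",
            "networking",
            "communication",
            "communication",
        ],
    }
)

# How often does each theme appear, and how many distinct students set it?
theme_summary = goals.groupby("theme").agg(
    goals=("goal_text", "size"),
    students=("student", "nunique"),
)
print(theme_summary)
```

The human coding is the hard, expert part; once it’s done, the counting is the easy part.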
Next up on the home campus front, we’ll conduct focus groups with student interns about their learning outcomes. Over the next few weeks, I’ll have pre-interviews with student interns to start structuring the focus groups. So much great assessment happening – stay tuned for more assessment fun!

Assessment Takes a Village: The Power of Collaboration

Collaboration is an awesome thing. I’ve been working with various people and departments on campus on some exciting assessment projects over the last year. Good assessment takes a village; I can’t do it alone. It’s been a pleasure and a gift to benefit and learn from all the talent of my campus colleagues.

Here’s an overview of just a few of these projects:

  • Institutional Research: The associate director of IR and I have been working on all sorts of projects. One such project focuses on the learning goals our summer interns created prior to their internship. She analyzed these learning goals, coded the goals into overarching themes, and (eventually) will examine how those themes overlap with learning goals in the classroom (a project and expertise she’d initiated at a previous institution). 
  • ITS: Wow…ITS has helped our office with a number of incredible projects. Too many to list! They built a digital pipeline from our internal counseling note system (i.e., Symplicity) to the College’s data warehouse (read: so much data mining potential!!). And, they built us an interactive online career development tool for students – it’s an organizer, planner, and tracker all in one. Students can see our Career Center learning goals and the programs/services tied to each, select which they’d like to complete, pick a date they’d like to complete it by, and then check it off to track their progress. Did I mention, WOW?!?!
  • College Communications/Marketing: College communications took our survey data about student interns and created some really beautiful data visualizations. I don’t have an actual proof to show you yet, but the concept comes from *TIME Magazine. The original piece (see photo) focuses on income brackets, whereas ours focuses on interns by year and shows data about students and their internships (e.g., geographic location of internship, top internship industries, etc.) along the sides surrounding a photograph of the student.
    [Photo: TIME intern posters]
  • Mathematics/Statistics Department: This academic department offers a statistics elective called Statistical Consulting – the class organizes students into consulting groups, takes on actual organizations from the community as clients, and helps the organization address their real current issues using data. The Career Center was a client – the student group reviewed our data about student visits and helped us better understand which students we’re seeing, how often, and during which weeks in the year. Conversely, we also have a better understanding of who we’re not seeing. Valuable insights from this collaboration.
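
For the curious, here’s a minimal sketch (in Python with pandas) of the kind of visit-data breakdown described in that last bullet – the file and column names are placeholders, and this is my rough approximation, not the consulting group’s actual analysis:

```python
import pandas as pd

# Hypothetical appointment export and registrar roster; names are placeholders.
visits = pd.read_csv("career_center_visits.csv", parse_dates=["visit_date"])
roster = pd.read_csv("registrar_roster.csv")

# Which weeks of the year are busiest?
visits_per_week = visits.set_index("visit_date").resample("W").size()

# Who are we seeing, and how often, by class year?
visits_per_student = visits.groupby(["class_year", "student_id"]).size()
avg_visits = visits_per_student.groupby("class_year").mean()

# Who are we NOT seeing? Compare the visit log against the full roster.
never_visited = roster[~roster["student_id"].isin(visits["student_id"])]

print(visits_per_week.head())
print(avg_visits)
print(f"Students with no recorded visits: {len(never_visited)}")
```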

There are MANY other great departments and people collaborating with the Carleton Career Center; these are just a few from the last year. Assessing learning can be a blessed mess, so an ENORMOUS thank you to all the many people and offices who helped us achieve so many of our assessment goals. We couldn’t have done it without you!

With whom are you collaborating??

*Barone, E. (2014, September 8). Who we are. TIME, 184, 53-58.