Where do I begin?


It starts when you notice the local construction projects winding down. Then you’re cut off by a Ford Escape with New Jersey plates and a back full of clothes, books, a pink rug, and one of those chairs made solely of bungee cords. That’s right, it’s back to school season.

Abby and I met for a pre-season revamp of Oh No, and you can look forward to two posts a week this year — Mondays and Thursdays. We’re trimming a bit because those Friday posts came upon us awful fast and we want to keep this thing valuable. So without further ado…

I met with a colleague from across campus this week. She works for a newer (and smaller) office just starting to wrap its mind around how to capture its value to the students it serves. The office focuses on developing an entrepreneurial mindset in our students and supporting student ideas from conceptualization to implementation. To further complicate its assessment process, the office is not yet on permanent funding, thus is under pressure to justify its existence.

I’ve already covered starting over in my Zero to Assessment post, but this conversation yielded a few new questions I wanted to chew on a bit.

What if I don’t know what students are learning from this experience? In the old, dusty textbooks of assessment you’ll find a flow chart looking something like this…


oh, well hello smart art…

This flow chart is helpful if you have clear and measurable learning outcomes, but leaves out instructions for when your outcomes are a bit cloudy. My colleague proposed measuring this through a series of qualitative questions — which, despite my aversion to the labor-intensive nature of properly analyzing qualitative questions, seemed appropriate given the situation. And you know what, old dusty textbook I made up to illustrate my point, if an office centered around innovation can’t build a plane while they’re flying it, can any office? That is, if we can’t get an initiative started until we have every detail (e.g., assessment) ironed out, we’ll miss out on a good number of valuable initiatives.

While I’m complaining about the rigidity of fictitious textbooks, it’s worth acknowledging that neither she nor I was all too sure of how she would analyze the data she’s collecting. It would be great if she had the labor to code each response, but that doesn’t seem likely. I think this is okay. It takes a few cycles to get an assessment process ironed out. Even by simply reading through the responses, she’ll get a feel for what her students are learning and how to better support them.
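If a full coding pass isn’t in the cards, even a rough keyword tally can surface recurring themes in open-ended responses. Here’s a minimal sketch of that idea — the themes, keywords, and sample responses are entirely hypothetical, and this is a first-read shortcut, not a substitute for proper qualitative coding:

```python
from collections import Counter

# Hypothetical theme keywords an advisor might scan for in open-ended responses
THEMES = {
    "confidence": ["confident", "confidence", "comfortable"],
    "teamwork": ["team", "collaborate", "group"],
    "pitching": ["pitch", "present", "presentation"],
}

def tally_themes(responses):
    """Count how many responses mention each theme at least once."""
    counts = Counter()
    for response in responses:
        text = response.lower()
        for theme, keywords in THEMES.items():
            # A response counts toward a theme if any keyword appears in it
            if any(keyword in text for keyword in keywords):
                counts[theme] += 1
    return counts

# Made-up responses for illustration
responses = [
    "I feel more confident pitching my idea to a group.",
    "Working with a team taught me to collaborate.",
    "The presentation practice helped the most.",
]
print(tally_themes(responses))
```

A tally like this won’t capture nuance (sarcasm, negation, themes you didn’t anticipate), but it can point you toward which responses deserve a closer read.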

How do I get students to reply to my surveys? If I ever figure this out, I’m leaving the profession of higher education to hang out with the inventor of sticky notes on an island covered in boats, flat screen TVs, and Tesla convertibles. And, I guess, charging stations for the convertibles.

Very few people know how to do this well, but I’ve come across a few strategies that seem to be working.

-Make it personal. More than half of my job is forming relationships with students. Surveys are one of the times I leverage those relationships. I’ll often send the survey link in an e-mail noting (very briefly) the importance of the data collected in this survey, letting them know that every response is a favor to me (for all of the time I spend e-mailing them with answers to questions I’ve already answered in previous e-mails, this is the least they could do). If you’re sending a survey out to thousands, you can expect a very low return rate.

-Get ‘em while they’re captive. Do you have advising meetings with students at the start of these programs? Is there an application to get into your program? Can you (easily) tie in the survey as a requirement for completing the program? I don’t mean to hint that surveys are the only means of collecting assessment data — but they’re direct, effective, and tend to be less labor intensive than other means.

Countdown to College football: 3 DAYS!

Stay the Course: Reminders for When Assessment Gets Messy

My friends for the assessment revolution! My office is gearing up to take the next step in our learning outcomes assessment efforts. I’m VERY excited! It’s going to be fun, intellectually and professionally fulfilling, and (most importantly and hopefully) provide meaningful insight into the student experience. But in addition to excitement, I am also a bit nervous, because, as you’ve likely noticed, measuring for learning is messy – which is the largest part of its difficulty, but, also, its beauty. In my research about student learning and assessment over the past few years I’ve come to learn that it’s not just me who’s feeling this way:

In watching videos like the above and reading anything I can get my hands on, I’m hearing a few common themes (some old, some new) that I’m keeping in mind during this big year for our assessment efforts in the Career Center:

  1. Assess learning not just once, but at multiple points and from different parts of the student experience. (read: Learning is happening all over campus; thus, assessing learning all over campus is not just a good idea, but needed.)
  2. Give students multiple opportunities to practice their learning in high-touch, intentional, reflection-centric ways. (read: It’s going to take a lot of time, there’s no quick fix, so settle in for the long haul and love the process.)
  3. Assessment tells the story of student learning, but let the student be the narrator. (read: Ask students to narrate their learning and they will tell you! Their story IS your assessment data. Now use that data to tell the larger story of student learning at large.)
  4. Set up assessment to do double duty for you – it can be a learning tool in and of itself, in addition to a data collection tool.

    “…a really interesting kind of analytics should reveal to the learner even more possibilities for their own connected learning. The analytics shouldn’t simply be a kind of a diagnosis of what’s happening now but analytics at their best can be a doorway that suggests what else is possible.” -Gardner Campbell, Vice Provost for Learning Innovation and Student Success at Virginia Commonwealth University

  5. Follow best practices in assessment while also breaking the mold, because learning’s blessed messiness means it’ll always need more than the gold standard. (read: Follow and break the “rules” of assessment at the same time – simple, right????)

It might be a messy year in assessment, but that’s ok, because it’s a worthwhile pursuit. And as my supervisor reminded me when I was wigging out about it recently: remember, nothing ventured, nothing gained.

So commit to the adventure and just do it.

Top Tier vs Lower Tier Engineering Programs

Greetings everyone! In the past few weeks, I purchased a house. The buying process took up much of my free time (though didn’t seem to diminish my blog-ambition). Anyway, with summer in the home-stretch it’s time to get back into it. Abby and I will have our regular weekly posts returning soon. In the meantime, my brain is in the depths of “big thinking” mode. As a former engineer, and an advisor of first-year engineering students, I often find myself thinking about how we educate them.

Modern society poses complex problems. No longer is it enough for engineers to create a widget, present it to the world, and say “go forth, use this widget to make your life easier!” Beyond the question of making things faster or more efficient, we want to know if a widget intended for a developing nation can be manufactured with local materials at low cost. Will the locals even want to use the widget?

In addition, we’re increasingly reliant on our engineers’ moral decision making. So you want to build a car that drives itself? How much will one of these cost? Can it share the road with human-driven automobiles? If the car finds itself in a certain-impact situation, should it rear-end the car in front of it or veer off onto the sidewalk? Our engineers should have the technical toolbelt needed to design the widget as well as the personal development required to decide how to use those tools.

This takes me to the state of today’s engineering education. Each engineering degree has a set of technical content expected of a person with that degree. The technical content takes up most of what can be covered in four years. This leaves little room for personal/moral/ethical development in the curricular portion of a student’s experience, and has us (student affairs folks) urging students to get involved outside of the classroom.

Students arrive at college with varying levels of ability and different backgrounds. The most desirable students tend to have credit from Advanced Placement courses or college courses they took while in high school. Assuming they plan to stay for four years, completing college requirements while in high school leaves students with more “wiggle room” in their schedule to pursue interests in addition to the bachelor’s degree. This may mean a minor or even taking fewer credits each term to allow for more extra-curricular time.

But what about the students who do not arrive with AP credit — the students who have the grit and determination, but have not yet studied calculus and may not have stellar standardized test scores? These students are often overlooked by the more competitive institutions. When you’re looking at thousands of applications, you need a means of sorting through them quickly.

We’re left with two tiers of institutions. Students accepted into the lower tier of institutions are not coming in with much college-level coursework completed. Since those students will be expected to have a certain toolbelt upon graduation, those programs are forced to put their resources into the classroom experience. Students accepted into the upper tier of institutions – already having college credit – have the room to take a minor in philosophy or spend time working on research in a professor’s lab.

What’s the difference? Top-tier students have the time for extra-curriculars that force them to juggle the more complex questions, often leading to the personal/moral/ethical development required of today’s engineers. Lower tier students are forced to put their time into learning technical content. They’re acquiring the tools but are not challenged to think about how to use them.

Both of these groups finish with bachelor’s degrees, the signal to employers that they’re ready for the workforce. What happens next? My gut tells me that top-tier students, equipped with experience in how to use the tools, are moving into leadership positions at a higher rate than their lower tier colleagues.

The questions I keep coming back to: Is the personal/moral/ethical development part of what is (or should be) a part of the bachelor’s degree? With the high cost of higher education, are we limiting the potential of some students by not allowing them the opportunity for that development? Does our current educational system, which nudges affluent students toward leadership, exacerbate the gap between the upper and lower socioeconomic classes?

Assessment on the Road: Boise State

Images of Idaho

My summer of travel is sadly coming to an end. I just got back from one of my last destinations: Idaho! (Boise, to be exact) It was my inaugural trip and I’m happy to report back what so many already know: Idaho is beautiful. (Thanks Wear Boise for the great beard image!)

Being the lover of colleges and campuses that I am, I had to visit Boise State University while I was there. Judging by all of the extra signage around campus, I was clearly there on a summer orientation day (which I love, because I love orientation).

Boise Career Center welcome

Once again, the Career Center of this new-to-me campus grabbed my attention. I REALLY loved their graphic representation of their career learning goals. What a great way to be transparent about, and engage students in, the goals the Career Center has for them. I think that is a pivotal step toward students actually learning and achieving those goals!

CC make it count detail

Another data visualization from Boise State that I loved came from their admissions office. Here’s what I love about it:

It’s simple. Easy graphics, consistent color palette. Clean.

BSU data brochure5

It has numbers AND text. They bring life and strength to each other.

BSU data brochure

There are lots of different kinds of data. I think, more and more, the public wants LOTS of data all at once so that they can quickly skim and find what is most meaningful to them.

BSU data brochure2

It’s a nice size. Folds up as a small brochure (6″x3″) and folds out as a small poster (18″x12″).

BSU data brochure3

It’s printed on a nice weight of paper. I know, this sounds weird, but if you’re going to be printing nice graphics, you want to have nice paper on which to put them.

Thanks for the great visit Boise State U! Until next time!