Assessment Summer Retreat

Hello assessment friends! The summer has been fun, but flying by. It kicked off with Mark’s Bachelor Party (in Niagara Falls), and then the wedding – there were beautiful vows, dancing, and pie; I was a groomsmaid; a rooster crowed loudly throughout the ceremony; and it was a happy day.

Congratulations Mark & Kate!

Then, I went to Disneyland and it was truly magical.


And now it’s time to get back to work! 

What I hear from people all the time is that they’ve done so much good, diligent work throughout the year collecting program attendance numbers, student feedback, and surveys, but they have no time (and maybe some analysis paralysis) to dig into what all that delicious data says about what students learned.

If you’re like me, you work amongst thoughtful, passionate, student-centered professionals. And, summer in the office is THE time to reflect, work on projects, and retreat. So, bring all of these elements together – your student usage/engagement data/feedback, great colleagues, and summer reflection time – to tackle the question: how did all of our initiatives and efforts with students impact their learning?

Next week my colleagues and I will be retreating (well…not actually away from campus, but you get what I mean) to do just this: review our office goals and student engagement data, figure out what it all means, and strategize and vision for the coming year. Specifically, we’ll be focusing on:

  • What patterns do you see in our student usage data?
  • What do the data and patterns say to you about how students engaged with our initiatives and services?
  • How did all of our initiatives and efforts with students impact their learning?
  • How does our student engagement data relate to our annual office goals?

Don’t be afraid to not have all the answers, and don’t be afraid to struggle as a group through these types of questions. This will help you make data- and assessment-driven decisions that ultimately help students (and, hopefully, it will be a little fun too!).

Happy retreating!

Survey Results? 5’s All Over The Place

A Friday post?!? Yeah. Who knows how this happened.

A year ago, I created a short program assessment for our Peer Advisor program. The idea: to capture the extent to which our Peer Advisors (PAs) learned what we wanted them to learn. Some of the learning outcomes were related to office function, for example: How comfortable do you feel answering the phone? Others were a bit more about whether they are drinking the advising Kool-Aid — apparently Kool-Aid is spelled with a “K” — things like: To what extent do you agree that professional development is important to you? And of course, there were Likert scales (pronounced LICK-ert. I learned that at a conference once. I’ll tell you the story later).

The results you ask? 5’s. 5’s all over the place (out of 5). Half of the not-5’s were the result of 1 peer advisor who must (…oh god I hope) have chosen 1’s but meant 5’s. What do I make of this? With such a small sample size (we hire 10 PAs per year), a 4.78 and a 4.89 sound near-identical to me. I’d certainly be hesitant to conclude that the 4.78 is all that different from the 4.89. Deep within me is a fool shouting this must mean everything is perfect! My brain says otherwise. Maybe this assessment needs some work.
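If you want to see why I’m hesitant, here’s a minimal sketch in Python (the scores below are made up for illustration, not our actual PA data) of how much wiggle room sits around a mean when you only have about 10 respondents:

    # Rough uncertainty around a small-sample mean (hypothetical scores).
    from statistics import mean, stdev
    from math import sqrt

    question_a = [5, 5, 5, 5, 5, 5, 5, 4, 5, 4]   # mean 4.8
    question_b = [5, 5, 5, 5, 5, 5, 5, 5, 5, 4]   # mean 4.9

    for name, scores in [("Question A", question_a), ("Question B", question_b)]:
        m = mean(scores)
        se = stdev(scores) / sqrt(len(scores))     # standard error of the mean
        print(f"{name}: mean {m:.2f} +/- {2 * se:.2f} (rough 95% interval)")

With intervals that wide, those two means land right on top of each other, which is why I’d never build a decision on the gap between a 4.78 and a 4.89.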

As a quick side note: I’m a firm believer in the Strongly Agree to Strongly Disagree scale. When you use a standard scale of measurement, you can compare one question to another — which can be useful.

So what’s a man/woman/assessmenteur to do?  I have a few ideas:

Avoid the whole situation in the first place. When designing your surveys, consider how students are likely to respond to a given question. If your gut (trust your gut!) tells you almost every student will respond 4 or 5, raise the bar. Look for easy ways to bring down your scores. For example, replace words like familiar with, aware of, etc. with confident, command of, etc. If your question is quantitative or involves a frequency, up the numbers — for example, weekly not monthly.

Continue avoiding the situation in the first place! That is, avoid leading questions. If I’m asked a true or false question like “Do you know that Michigan has the winningest softball coach in the country?”, well, I don’t think I knew until you asked the question, and I don’t want to be wrong, so…. TRUE!

Lump your responses. Focus on the number of students who select “agree” or “strongly agree.” For example, “90% of students agreed or strongly agreed with statement x.” By lumping scores together, you blur some of the noise created by so many response options, and the data is simpler to look at. You can also check out the negative side. Let’s say you want to know whether students are adjusting socially. You might want to see how many students disagreed or strongly disagreed with the statement “I have made friends on campus.”
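If your responses live in a spreadsheet export, the lumping itself is a one-liner. A minimal sketch (the response numbers are made up, with 1 = strongly disagree through 5 = strongly agree):

    # Collapse a 5-point scale into top-two-box and bottom-two-box rates.
    responses = [5, 4, 5, 3, 5, 4, 2, 5, 5, 4]   # hypothetical survey answers

    agreed = sum(1 for r in responses if r >= 4)      # agree or strongly agree
    disagreed = sum(1 for r in responses if r <= 2)   # disagree or strongly disagree

    print(f"{agreed / len(responses):.0%} agreed or strongly agreed")
    print(f"{disagreed / len(responses):.0%} disagreed or strongly disagreed")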

Break down your demographics. If you have a large enough sample size, try looking at how the men responded in comparison to the women, or how the history majors responded versus how the engineers responded. While I don’t recommend breaking down the data just for the sake of breaking it down — unless you have bundles of time on your hands — this might yield insights you otherwise would have missed.
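Here’s what that breakdown can look like in practice, assuming your survey export has a column for major and a numeric score column (the column names and values here are hypothetical):

    # Compare responses across a demographic field using pandas.
    import pandas as pd

    df = pd.DataFrame({
        "major": ["History", "Engineering", "History", "Engineering", "History"],
        "q1":    [5, 4, 5, 3, 4],
    })

    # Keep the counts next to the means before reading too much into any gaps.
    print(df.groupby("major")["q1"].agg(["mean", "count"]))

A dramatic-looking difference between five history majors and five engineers is probably just noise, so keep those counts in view.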

I collected data! Now what?!

We’re coming to the close of yet another academic year and you did it! You surveyed students or tracked who did (and didn’t!) visit your office or understood the student learning outcomes from a program or whatever we keep preaching about on this blog. But, now what???? If you read any assessment book, at this point there are common next steps that include things like “post-test” and “close the loop” and a bunch of other common (and good!) assessment wisdom. But sometimes that common assessment wisdom isn’t actually helping any of us professionals DO something with all this data. Here are a few things I do with my data before I do something with my data:

  1. Share the data in a staff meeting: Your colleagues may or may not be formally involved in the specific program you assessed, but they work with the same students, so they’ll be able to make connections across student learning and to other programs/services that you’re missing. Ask them about the themes they’re seeing (or not seeing!) within the data. It’ll help you clarify what your data is telling you, bring more people into the assessment efforts in your office (more heads are better than one!), and it’s a nice professional development exercise for the whole team. Teamwork makes the dream work!
  2. Talk to peer colleagues about their version of the same data: Take your data* to a conference, set up a phone date with a colleague at a peer school, or read other schools’ websites. Yes, you’ll likely run into several situations that aren’t directly applicable to yours, but listen for the bits that can inspire action within your own context.
  3. Take your data to the campus experts: Know anyone in Institutional Research? Or the head of a curriculum committee? Or others in these types of roles? These types of people work with the assessment process quite a bit. Perhaps take them to coffee, make a new friend, and get their take.
  4. Show your data* to student staff in your office: Your student staff understand the inner workings of your office AND the student experience, so they’re a perfect cross section of the perspective that will breathe life into the patterns in your data. What do they see? What data patterns would their peers find interesting? What does it mean to them?

WOW, can you tell I’m an extrovert?! All of my steps include talking. Hopefully these ideas will help you to not only see the stories of student learning and programmatic impact in your data, but also to make the connections needed to progress toward closing the loop.

* This goes without saying, but a reminder is always good: make sure to anonymize the data you show students and those outside of your office/school!

Assessment in a Year: A Sincere Thank You

Oh No is a year old this month. Can you believe it?! What a year it’s been! I remember how it all started, like it was yesterday. Mark and I were up at the crack of dawn (which is our norm) during one of the few times a year that we’re actually in the same state, sitting in his kitchen discussing the idea of generating a conversation about assessment in student affairs and higher education. We simply didn’t know who (if anyone!) would want to participate. We thought, Well, if nothing else we’ll just keep doing what we’re doing (which is either talking to each other about assessment or boring our respective significant others with our assessment talk), it’d merely be on a blog instead of via Google Hangout. But you all and our amazing guest bloggers made this year meaningful and brought in much-needed topics we had been missing. Wow. THANK YOU all sincerely for reading, liking, reposting/sharing/retweeting, and sharing with us how this thing we love/hate called assessment looks in your world.

[Blogiversary e-card: “Happy blogiversary! Here’s to another five years of beginning every day wondering if an event is blog-worthy or not. It always is.”]

I’m excited about our future at Oh No. Here’s what it looks like so far:

  1. More guests: You all love reading our guest bloggers (and so do Mark and I). We will have as many as we can get! If you want to write about any assessment aspect (or maybe you know someone who you wish would write a post), PLEASE let us know!
  2. More time for simmering (read: fewer posts): I cannot count the number of people who tell me that they like the blog but can’t keep up with the volume. WHAT?! You mean constantly talking about assessment 3x/week EVERY week doesn’t endlessly interest you????? Of course not, Abby and Mark! Duh! It can be overwhelming given your limited time (ours too!), so we’ve already been paring it down considerably.
  3. A new look is coming: GET EXCITED!!!! I know we are!!! Oh No is getting a facelift, so stay tuned!

Again, we cannot thank you all enough for reading and engaging in this assessment conversation with us. It’s so much more fun with you! Cheers to another year!

One Whole Year of Assessment

The blog turned one year old last week. Our first post went out on April Fools Day. I spent a week convincing my friends that I really was co-starting a blog. No really. It was not a joke. Yes, we probably could have come up with a better launch date, but who doesn’t love April Fools Day? Should I be capitalizing April Fools Day? Who knows. Wait. Yes, it’s a proper noun. Nevermind that last question.

Some people say having a dog is like having a baby. I think having a blog is like having a baby. Perhaps you didn’t come up with the idea, but once the wheels were in motion, it was hard to turn around. After a while, all my friends knew I had a blog. Most of them probably didn’t want me to remind them weekly with a Facebook post, but heck, those posts are not going to read themselves. I think that’s where the metaphor ends.

When we started this thing, we had no idea what it would turn into. I didn’t even know what I wanted it to turn into. Of course, I wanted it to be read. To get other people thinking about assessment. To start a conversation about assessment. Deep down, part of me just wanted it to exist as a place I could go to write things out. Much like a journal (but, at times, significantly less interesting), the typing helps me organize the cluster of assessment thoughts passing through my mind.

<<<TIMEOUT: This is starting to feel like a blog-ending post. It is not. I repeat, not, the last post. This is merely a “one year in thoughts” post. Proceed. >>>

Generating content is tough. We went from 3 posts per week, to a short summer break, to 2 posts per week, and I think we’ve now settled into a steady (yet approximate) 1 post per week. It was easy to come up with ideas at first, but as we’ve worked through them, I find myself starting with single-sentence assessment-related statements and wondering, “can that be stretched into a post?”

Marketing a blog is equally difficult. Why have a blog if I don’t want people to read it? If I don’t try to publicize the blog, it feels like I’m not even trying. So, onward we go, collecting a few followers at a time. On the upside, WordPress gives us site statistics, and it seems folks are in fact reading this — enough folks, in fact, that they cannot all be family members.

Assessment, man/woman/you. Sometimes I wonder if assessment is inherently boring. While I’m always getting better at it, and I’m often interested in the results, discussing assessment in a universal way is difficult. I find myself often thinking “will anyone care to read this?” But here you are, paragraph 6 and still reading.

Onward we go. I will continue to generate content as long as I feel it adds value (to myself or others). I will continue to spend time staring at a blinking cursor trying to get started. I will try to make assessment sexy. I will not continue putting off blog entries. How do I know this? I’m almost out of new Mad Men episodes to watch.

Keep on assessin’!

Assessment of Academic Probation

As our office continues to develop, we’re starting to implement smaller assessment pieces to pair with larger programs. Two years ago, we created an assessment of our academic advising, probably a good place for an advising office to start. Since then, we’ve initiated assessments for our Peer Advising program (student staff who support our office) and our Peer Mentor program (a mentorship program we offer the students we serve).

This week, we discussed the addition of a new assessment for our probation program. We’ve established a structured process for our students who are not performing well academically, but we do not yet have a means of evaluating this effort. I thought I’d share some of my thoughts on this assessment, as disconnected and undeveloped as they may be.

This is a difficult group to assess. For one thing, we put a lot of work into developing relationships with our students. I don’t want to jeopardize that relationship with an end-of-term “You failed your classes. Here, take this survey and become a row on a spreadsheet.” These students need to feel like their advisor knows them. That’s an assumption, but I’d like to hear from those who disagree. I can’t help but feel that collecting data from them directly treats them like mail on a conveyor belt.

Beyond that, we make a lot of assumptions about what’s “good” or “right” for this group; for example, that they’re attending tutoring or meeting with a counselor. It’s quite possible for a student to completely follow the plan we lay out in September and find themselves struggling again. If we decide Steve should be using a planner, and he uses the planner dutifully all semester and struggles, does this mean our process is working? Most can agree that a successful GPA is an important outcome, but if we look solely at GPA, we might miss a lot of relevant progress a student is making. Would a lighter/easier class schedule then skew the assessment — making it look like a student made more progress than they really did?

What are we assessing here? Clearly, GPA is important; but can’t a student make progress personally without doing well academically? In that case, should our assessment conclude that our process is not succeeding?

Just who are we assessing here? How much impact can we make on a student? Though we’re charged with supporting the student, we’re not the ones taking the exams. Should our assessment measure a student’s progress or how well they adhere to our support program? If the former, it seems we’re measuring a group’s natural talents — that is, a good group might skew the data to make it look like our process is working. If the latter, we’re assuming that we know what’s right for the student and perhaps ignoring what’s really important. Yes, that was vague on purpose.

The question-to-statement ratio in this post is a bit out of control; I apologize for that. I’ll keep thinking and perhaps put together a post with some practical thoughts rather than the broad ones I pose above.

Keep on keeping on and if you have any thoughts, please reply.

Guest Blogger: Accessible Survey Instruments

Picture of blog author, Chanelle White

You’ve probably thought a lot about WHAT you ask students when you are assessing learning outcomes. Perhaps you’ve never considered HOW you ask those questions? I’m talking specifically about universal design, or the concept that making things accessible benefits everyone. If you have an assessment that is sent to students, it should be accessible for all. Many offices use surveys ranging from home-grown Microsoft Word documents to sophisticated tools like Formstack.

Perhaps your college or university is currently re-evaluating all of the surveys you use. Perhaps you’re frantically googling universal design (#tbt to me googling “snappy casual” last month in panic mode). Most likely you’re somewhere in between. Here are some considerations as you develop your survey instrument.


I can identify 3 barriers: education (or lack thereof), fear, and money.


Education

Perhaps your staff isn’t educated in the need for accessible design, the Americans with Disabilities Act or the Rehabilitation Act, or civil rights (but seriously). Research it yourself, take a class, develop a relationship with a campus partner, and challenge yourself to be an advocate for students with disabilities!


Fear

Listen, I’m not an expert in accessible survey instruments, nor am I a computer programmer. I’m just a person who wants to help others and break down barriers in this complex educational system. Sometimes you need to cast fear aside: jump in head first, get your hands dirty, and learn something new.


Money

When in doubt, money is always the problem, right? But you may be thinking, I don’t have any money to make these compliant. Or, my school won’t pay for a new accessible tool. Actually, you can make accessible Word documents. Seriously, click File, then Check for Issues, and finally Check Accessibility. As long as you’re not doing complicated data visualization, it’s pretty easy. [Disclaimer: If you are doing some fancy things, you may want to get an expert to look at your documents. If we’re talking about a 10-question survey, I think you can figure it out. May the force be with you.]

Assessment Design

Use person-first language.

Bad: A disabled student asked me a question.

Good: Jennifer asked me a question.

This sounds silly, right? Jennifer is a person too and she doesn’t need to be defined by her mobility, disability, wheelchair, etc.

Use headings and be descriptive!

They help students who use screen readers to quickly navigate the page.

Bad: Section I

Good: Personal Information

Use descriptive links.

When trying to find information in an email, website, or Word document, students who use screen readers may tab from link to link to quickly find information. If you bury the link to your survey in an email, students may give up and not take the survey!

Bad: Click here to take my survey. [Not a real survey]

Bad: Take my survey here. [Definitely not a real survey]

Good: Please take the Oh No It’s an Assessment Blog 2016 Survey. [Still not a real survey]
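If your survey invitation goes out as an HTML email, you can even automate a quick sanity check for vague link text. This is just a sketch of the idea (the checker class and the example markup are mine, not something from any real tool):

    # Flag vague link text like "click here" in HTML before sending it out.
    from html.parser import HTMLParser

    VAGUE = {"click here", "here", "this link", "link"}

    class LinkTextChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_link = False
            self.text = []
            self.flagged = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.in_link, self.text = True, []

        def handle_data(self, data):
            if self.in_link:
                self.text.append(data)

        def handle_endtag(self, tag):
            if tag == "a":
                label = " ".join(self.text).strip().lower()
                if label in VAGUE:
                    self.flagged.append(label)
                self.in_link = False

    checker = LinkTextChecker()
    checker.feed('<p><a href="https://example.com/survey">Click here</a> to take my survey.</p>')
    print(checker.flagged)   # ['click here']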

Use alternative text.

Does your survey include a bunch of images? Data visualization is awesome, but we need to make sure those images are conveyed to users with screen readers. Adding alt text is often as simple as editing the picture’s format settings and adding alt text, or right-clicking on the image to type a description of the image.

Keep it clean.

Bad: Attention students: Take my really awesome survey to win $50!

Good: High contrast text (dark blue or black on white), easy to read fonts, and no flashing buttons or images, etc.


Screen Readers

NVDA (free) and JAWS (costly) are two screen-reader tools. These are text-to-speech screen readers that many students who are visually impaired use. The biggest disadvantage to using these tools is that they require a lot of practice to become comfortable using them. If you personally are not familiar with these tools, perhaps you could strike up a conversation with someone in IT services, web development, or disability resources? You might learn something and have a new colleague across campus.

Accessibility Checkers


Free web accessibility checker. Easy to use: you simply paste your URL in the search box and BOOM, results.

Total Validator

“Total Validator is an HTML validator, an accessibility validator, a spell checker, and a broken links checker all rolled into one tool, allowing one-click validation of your website” (stolen directly from the Total Validator website). You must download a desktop application. They have a free version and a pro version.

I hope this brief overview gives you somewhere to begin checking accessibility (a11y for short) in your survey instruments. This post was not sponsored, but if you want to sponsor it, please contact my attorney (since I didn’t win the #powerball).

 Chanelle White is the Senior Academic Advisor in the Farmer School of Business at Miami University. Please leave your comments below or feel free to contact Chanelle via email at

Focus Groups: How Do We Get Students to Participate?

I’m back conducting another focus group. You may remember that I did a focus group earlier last year to get feedback and ideas for the design of our online learning goals tool (“Career Tracks”). The student voice was influential to Career Tracks’ design: thanks to the great feedback we received from students, it now has a whole “Planning” component where students can put career development tasks, services, and programs on a calendar and add their own deadlines. It’s likely not surprising to you that student input positively shaped a college initiative, but this acts as a good reminder of the power that one student voice can contribute to the creation of effective, student-centered initiatives.

But my question lately is, how the heck do I get students to show up and offer their voice??? Recently I’ve been working on a research project focusing on what students learn from their internship about being an employee (a project that could not be done without the power of collaboration!). To collect data we held two focus groups and several one-on-one research interviews. To find participants, I reached out to a number of interns, provided lunch, held it during a time of day in which no classes are offered, and, besides RSVPing, there were no additional tasks students had to do to participate (so a very low barrier to entry). Sounds perfect, right? I’m guessing you know better; there is no perfect outreach method for students (but if you’ve figured that out, patent the idea and then become a millionaire – or, better yet, comment below with your revelations!).

I know many of us struggle with student participation in different forms, whether it’s getting students to complete surveys, vote in student council elections, attend open forums for on-campus faculty and staff candidates, or show up at other moments when the student view is imperative. But how do we get them to complete the things or show up at the stuff (outside of paying them or tying participation to things like course registration)? And how do we proceed if they don’t?

At the NEEAN fall forum in November, I attended a presentation on a topic related to student participation in surveys/focus groups/etc. A woman in the audience had herself been a participant in a longitudinal research project (over 15 years). She offered up some advice on how to get and keep students engaged with research/data collection-type projects that I will keep with me and share with you:

  • Show the project’s importance in the big picture – Communicate to students how their voice will shape and be an important part of the future of these initiatives for their future peers and colleagues.
  • But also keep it relevant to the present – Share with students how their participation contributes to college initiatives becoming more beneficial to them. Their voice will help make things better/more effective in their time at the college, not just in some nebulous future time.
  • Make it a mutual investment – In the case of a focus group, where you know your participants and they’re sharing much of their time for your project, make the time and effort to remember or attend one of their events. This of course isn’t always applicable (or in cases of confidentiality, appropriate) but if students are giving you their time, give them yours. Send a birthday card, attend their on-campus presentation, go to their orchestra concert, etc. The participant is investing in your project, so invest in theirs.
  • Follow up with the results and check in – Depending on the timeline and scope of your project, (briefly) check in with your student participants on the research’s progress and give them access to the results. Not only does this help with transparency but also keeps students engaged in the process, and, potentially, creates early adopters of the findings.
  • Preach and model ‘pay it forward’ – Whether you’re a student, faculty, or staff member, there will come a time when you will need other people to complete something for you (e.g., a survey, research questionnaire, etc.), so for this and other reasons, we should all probably be thoughtful about doing the same for others. This concept is larger than the bounds of one person’s project, so how do we as a college-wide community communicate this to students?? (Also, there’s got to be a term for this out there already – Data Stewardship? Civic Participation? Academic Responsibility? Survey Karma? – …ideas???)

I’m working on a few of these already, but the “pay it forward in data collection” is a concept I want to keep thinking about. I haven’t hit a millionaire-level idea with it yet but I’ll keep you all updated. You do the same. What have you done to get the student voice?

Guest Blogger: When Assessment and Data Are Too Much of a Good Thing

Shamika Karikari photo headshot

I love coffee. Strong and black. Coffee does something to my soul when I drink it. The smell of roasted coffee beans and steam coming from my favorite mug brings a smile to my face. Beginning my morning with a hot cup of coffee is sure to set a positive tone for the rest of the day. I love coffee. Then something happens. It’s 3 o’clock and I realize I’m on my fourth cup of the day. During this realization, I begin to notice my higher heart rate, funny feeling in my stomach, and that I’m a bit more energized than what is good for me. The truth is I had too much of a good thing.


coffee mug with quote about courage

What is your “too much of a good thing”? I’m convinced we all have it, whether we want to admit it or not. Nowadays it seems assessment and data have become one of higher education’s good things that we have too much of. I want to be clear: assessment and data are necessary in higher education. Both assessment and data are good, just like coffee. However, when we have too much of it and do not use it effectively, this good thing turns into something bad. I see this show up most often in three ways, which I describe below.

  • Quality over quantity. “Assess more and have more data” has been the message given to many higher education professionals. More isn’t inherently bad, but it also isn’t always necessary. When we expect professionals to assess more, are we equipping them to build effective assessment tools? Are we being thoughtful about targeting what we assess instead of assessing everything? Do we consider survey fatigue? We must consider these questions. Creating fewer, more effective assessment tools that provide rich data, instead of conceding to the pressure to assess everything, will serve professionals well. Switching the focus to quality over quantity is a shift higher education must consider.
  • Dust-filled data. When we leave something in a corner and don’t attend to it, dust will collect. The same happens with data in higher education. When we conduct multiple assessments, we end up with data that is covered in dust because we do not do anything with it. Because most of our data is stored electronically we don’t see the dust, but it’s there. It’s not enough to say we did an assessment. We must go a step further and use the data! We must analyze the information we’ve collected, share it with folks who need to know, and adopt a plan for how the data will be used. When we do this, our assessment becomes purposeful. When we do this, our investment in that specific assessment is justified. When we do this, our colleagues and students are best served. What timeline can you set for yourself to avoid dust getting on your data? What data currently needs dusting off?
  • Over our heads. Some higher education professionals have done a great job assessing in effective ways and utilizing the data collected. However, the dissemination of that data goes over our heads. The pressure professionals feel has turned into the need to create 30-page analyses of data. What happened to one-page summaries? When will we use technology to disseminate our data? How can we make the analysis and presentation of the data interesting, so people want to read and use it? These are all questions we should be asking when considering the dissemination of data. I have found infographics to be a highly effective way to disseminate information in an accessible way. Making changes to better share our stories is beneficial and necessary.

Assessment is a good thing. Making data-driven decisions is a good thing. We know this to be true. To ensure it doesn’t continue as too much of a good thing, professionals must consider the implications of the current way we do assessment in higher education. The survey fatigue students experience, the pressure to have data when making decisions of any size, and the expectation that we assess everything under the sun have clouded the goodness of assessment. How are you doing with quality over quantity? What data needs dusting off in your office? How can you make data accessible to all? Considering these questions will get you one step closer to keeping assessment good. Because remember, like my fourth cup of coffee in the afternoon, you want to steer clear of having too much of a good thing.

Mika Karikari is currently a doctoral student in Student Affairs in Higher Education at Miami University as well as an Associate in Career Services. Additionally, her professional background also includes academic support, residence life, and new student programs. You can follow Mika on Twitter @MikaKarikari or email her at

Sharing Data Effectively

One of the challenges we assessment-ites have is what data to share and how to share it. When sharing data, you want it to be both interesting and appropriate to the intended audience. For data to have impact, it must be interesting. But not all data should be shared. Because I don’t have a better word for it, I’ll call that the “appropriateness” of the data. If the data is detrimental to your mission, it may not be appropriate to share.

It all starts with your intended audience. Is the audience your director? the dean? students? Once you have the intended audience, it’s helpful to visualize with the table below:

[Chart: data plotted by how interesting and how appropriate it is for the intended audience, in four quadrants.]

I created this chart from the perspective of the students; if your audience is the math department, Dr. A’s calculus class becomes more appropriate. Similarly, if I’m the intended audience, how many bagels am I eating each week? The idea is for all of your reported data to fall in the upper-right quadrant.
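If it helps to make the quadrant concrete, here’s a tiny sketch of the same filter in code (the items and their ratings are hypothetical, judged from a student audience’s perspective):

    # Only report data that is both interesting AND appropriate for the audience.
    candidates = [
        {"item": "Bagels Mark eats per week", "interesting": False, "appropriate": True},
        {"item": "Grade distribution in Dr. A's calculus class", "interesting": True, "appropriate": False},
        {"item": "Courses past students most often took next", "interesting": True, "appropriate": True},
    ]

    report = [c["item"] for c in candidates if c["interesting"] and c["appropriate"]]
    print(report)   # only the upper-right-quadrant item survives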

And now, more on the appropriateness of the data…. 

At our last advisor meeting in the fall term, we discussed a new tool available to our students. This tool, integrated into the registration system, gives students information about the courses in which they might enroll. The system tells them what past students often took after this class and the degrees they sought. It even shows them the grade distribution of the class over the past few years. I had the requisite student affairs knee-jerk reaction: but do we want students to see grade data? Will they then avoid the “hard” classes and lean toward the “easy” ones? I put quotes around “hard” and “easy” because, you know, there are no easy or hard classes — every student’s experience is different.

After learning about the student interface, we were introduced to the staff interface. What we see has MUCH more information. The system allows us to drill down and look at specific groups of students (sophomores, engineering students only, underrepresented groups, etc.). It’s a powerful tool I found myself lost in for about 45 minutes that afternoon. It’s the Netflix of work; once opened, who knows how long you’ll be in there.

My thoughts bounced around like a ping pong ball in a box of mouse traps. From Students should not be able to see this! They’ll simply take the easier courses! To Students should have access to EVERYTHING! They need to learn to make data driven decisions! Then I started to settle down.

It’s good for us to share information with our students — especially information that interests them. They’ll make a ton of decisions in their lifetime and need to navigate the information that’s out there. Sure, some of them will choose the high average GPA classes, but would they have been better served if we stuck to the usual “Nope. I can’t give you any guidance on this. You need to surf the thousand-course course guide and find a class that interests you.”?

But some data shouldn’t be widely available. If you’re trying to increase the number of women in your institution’s math major, it might be counterproductive to allow undeclared students to see “Oh, this major is 90% men. I don’t know if that’s a place I want to be.” It seems to me that kind of information sustains the imbalances we already have and are trying to mitigate.

To conclude…

It’s easy to get pulled into the “oh, can we add a question on ______ in the survey?” cycle. If you’re not careful, you end up with an oversized Excel spreadsheet and a bored audience. When you feel the survey creep happening, get back to the questions of: Who is this for? Is this interesting to them? Is it appropriate for them?

Now go YouTube other videos of ping pong balls and mousetraps.