Assessment of Academic Probation

As our office continues to develop, we’re starting to implement smaller assessment pieces to pair with larger programs. Two years ago, we created an assessment of our academic advising, probably a good place for an advising office to start. Since then, we’ve initiated assessments for our Peer Advising program (student staff who support our office) and our Peer Mentor program (a mentorship program we offer the students we serve).

This week, we discussed adding a new assessment for our probation program. We’ve established a structured process for students who are not performing well academically, but we do not yet have a means of evaluating this effort. I thought I’d share some of my thoughts on this assessment, as disconnected and undeveloped as they may be.

This is a difficult group to assess. For one thing, we put a lot of work into developing relationships with our students. I don’t want to jeopardize that relationship with an end-of-term “You failed your classes. Here, take this survey and become a row on a spreadsheet” survey. These students need to feel like their advisor knows them. That’s an assumption, but I’d like to hear from those who disagree. I can’t help but feel that collecting data from them directly treats them like mail on a conveyor belt.

Beyond that, we make a lot of assumptions about what’s “good” or “right” for this group. For example, that they’re attending tutoring or meeting with a counselor. It’s quite possible for a student to completely follow the plan we lay out in September and still find themselves struggling. If we decide Steve should be using a planner, and he uses the planner dutifully all semester and still struggles, does this mean our process is working? Most can agree that a successful GPA is an important outcome, but if we look solely at GPA, we might miss a lot of relevant progress a student is making. Would a lighter, easier class schedule then skew the assessment, making it look like a student made more progress than they really did?

What are we assessing here? Clearly, GPA is important, but can’t a student make progress personally without doing well academically? In that case, should our assessment conclude that our process is not succeeding?

Just who are we assessing here? How much impact can we make on a student? Though we’re charged with supporting the student, we’re not the ones taking the exams. Should our assessment measure a student’s progress or how well they adhere to our support program? If the former, it seems we’re measuring a group’s natural talents; that is, a good group might skew the data to make it look like our process is working. If the latter, we’re assuming that we know what’s right for the student and perhaps ignoring what’s really important. Yes, that was vague on purpose.

The question-to-statement ratio in this post is a bit out of control; I apologize for that. I’ll keep thinking and perhaps put together a post with some practical thoughts rather than the broad ones I pose above.

Keep on keeping on and if you have any thoughts, please reply.

Guest Blogger: Accessible Survey Instruments


You’ve probably thought a lot about WHAT you ask students when you are assessing learning outcomes. But have you ever considered HOW you ask those questions? I’m talking specifically about universal design, the concept that making things accessible benefits everyone. If you have an assessment that is sent to students, it should be accessible for all. Many offices use surveys ranging from home-grown Microsoft Word documents to sophisticated tools like Formstack.

Perhaps your college or university is currently re-evaluating all of the surveys you use. Perhaps you’re frantically googling universal design (#tbt to me googling “snappy casual” last month in panic mode). Most likely you’re somewhere in between. Here are some considerations as you develop your survey instrument.


I can identify three barriers: education (or lack thereof), fear, and money.


Education

Perhaps your staff isn’t educated in the need for accessible design, the Americans with Disabilities Act, the Rehabilitation Act, or civil rights (but seriously). Research it yourself, take a class, develop a relationship with a campus partner, and challenge yourself to be an advocate for students with disabilities!


Fear

Listen, I’m not an expert in accessible survey instruments, nor am I a computer programmer. I’m just a person who wants to help others and break down barriers in this complex educational system. Sometimes you need to cast fear aside: jump in head first, get your hands dirty, and learn something new.


Money

When in doubt, money is always the problem, right? But, you may be thinking, “I don’t have any money to make these compliant,” or, “my school won’t pay for a new accessible tool.” Actually, you can make accessible Word documents. Seriously: click File, then Check for Issues, and finally Check Accessibility. As long as you’re not doing complicated data visualization, it’s pretty easy. [Disclaimer: If you are doing some fancy things, you may want to get an expert to look at your documents. If we’re talking about a 10-question survey, I think you can figure it out. May the force be with you.]

Assessment Design

Use person-first language.

Bad: A disabled student asked me a question.

Good: Jennifer asked me a question.

This sounds silly, right? Jennifer is a person too, and she doesn’t need to be defined by her mobility, disability, wheelchair, etc.

Use headings and be descriptive!

They help students who use screen readers to quickly navigate the page.

Bad: Section I

Good: Personal Information

Use descriptive links.

When trying to find information in an email, website, or Word document, students who use screen readers may tab from link to link to quickly find information. If you bury the link to your survey in an email, students may give up and not take the survey!

Bad: Click here to take my survey. [Not a real survey]

Bad: Take my survey here, [Definitely not a real survey]

Good: Please take the Oh No It’s an Assessment Blog 2016 Survey. [Still not a real survey]

Use alternative text.

Does your survey include a bunch of images? Data visualization is awesome, but we need to make sure those images are conveyed to users with screen readers. Adding alt text is often as simple as opening the picture’s formatting options and adding alt text, or right-clicking on the image to type a description of the image.
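Checks like the two above can even be scripted. Here’s a minimal sketch using only Python’s standard library (the example HTML, the tag handling, and the generic-phrase list are my own illustration, not something from this post) that scans an HTML fragment for images with no alt attribute and for links whose text is a generic phrase like “click here”:

```python
from html.parser import HTMLParser

# Link text that tells a screen-reader user nothing about the destination.
GENERIC_LINK_TEXT = {"click here", "here", "link", "read more"}

class AccessibilityChecker(HTMLParser):
    """Flags two common survey-email problems: images without an alt
    attribute and links with non-descriptive text."""

    def __init__(self):
        super().__init__()
        self.issues = []
        self._in_link = False
        self._link_text = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # An empty alt="" is legitimate for decorative images,
        # so only flag images with no alt attribute at all.
        if tag == "img" and "alt" not in attrs:
            self.issues.append(f"image missing alt text: {attrs.get('src', '?')}")
        elif tag == "a":
            self._in_link = True
            self._link_text = []

    def handle_data(self, data):
        if self._in_link:
            self._link_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False
            text = "".join(self._link_text).strip().lower()
            if text in GENERIC_LINK_TEXT:
                self.issues.append(f"non-descriptive link text: {text!r}")

checker = AccessibilityChecker()
checker.feed('<p><img src="chart.png"> <a href="/survey">Click here</a></p>')
for issue in checker.issues:
    print(issue)
```

Running it on the fragment above reports both the missing alt text and the generic link text. A real checker (like the tools below) does far more, but this is the basic idea.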

Keep it clean.

Bad: Attention students: Take my really awesome survey to win $50!

Good: High-contrast text (dark blue or black on white), easy-to-read fonts, and no flashing buttons or images.
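“High contrast” can actually be quantified: WCAG 2 defines a contrast ratio between two colors based on their relative luminance, and recommends at least 4.5:1 for normal body text. As a sketch (the hex colors are just examples I picked, not from this post), here is that calculation in Python:

```python
def relative_luminance(hex_color):
    """WCAG 2 relative luminance of an sRGB color like '#00008B'."""
    channels = []
    for i in (1, 3, 5):
        c = int(hex_color[i:i + 2], 16) / 255
        # Linearize the gamma-encoded sRGB channel.
        channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG 2 contrast ratio, from 1:1 (identical colors) to 21:1 (black on white)."""
    la, lb = relative_luminance(color_a), relative_luminance(color_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#000000", "#FFFFFF"), 1))  # black on white: 21.0
print(round(contrast_ratio("#00008B", "#FFFFFF"), 1))  # dark blue on white
```

Black on white scores the maximum 21:1, and a dark blue like #00008B on white also clears the 4.5:1 threshold comfortably, which is why “dark blue or black on white” is a safe default.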


Screen Readers

NVDA (free) and JAWS (costly) are two screen-reader tools. These are text-to-speech screen readers that many students who are visually impaired use. The biggest disadvantage to these tools is that they require a lot of practice to become comfortable using them. If you personally are not familiar with them, perhaps you could strike up a conversation with someone in IT services, web development, or disability resources? You might learn something and gain a new colleague across campus.

Accessibility Checkers


Free web accessibility checkers are easy to use: you simply paste your URL in the search box and BOOM, results.

Total Validator

“Total Validator is an HTML validator, an accessibility validator, a spell checker, and a broken links checker all rolled into one tool, allowing one-click validation of your website” (stolen directly from the Total Validator website). You must download a desktop application. They have a free version and a pro version.

I hope this brief overview gives you somewhere to begin checking accessibility (a11y for short) in your survey instruments. This post was not sponsored, but if you want to sponsor it, please contact my attorney (since I didn’t win the #powerball).

 Chanelle White is the Senior Academic Advisor in the Farmer School of Business at Miami University. Please leave your comments below or feel free to contact Chanelle via email at

Focus Groups: How Do We Get Students to Participate?

I’m back conducting another focus group. You may remember that I ran a focus group earlier last year to get feedback and ideas for the design of our online learning goals tool (“Career Tracks”). The student voice was influential in Career Tracks’ design: thanks to the great feedback we received from students, it now has a whole “Planning” component where students can put career development tasks, services, and programs on a calendar and add their own deadlines. It’s likely not surprising to you that student input positively shaped a college initiative, but it’s a good reminder of the power that one student voice can contribute to the creation of effective, student-centered initiatives.

But my question lately is: how the heck do I get students to show up and offer their voice? Recently I’ve been working on a research project focusing on what students learn from their internships about being an employee (a project that could not be done without the power of collaboration!). To collect data, we held two focus groups and several one-on-one research interviews. To find participants, I reached out to a number of interns, provided lunch, held it during a time of day when no classes are offered, and, besides RSVPing, required nothing else of students to participate (a very low barrier to entry). Sounds perfect, right? I’m guessing you know better; there is no perfect outreach method for students (but if you’ve figured that out, patent the idea and become a millionaire, or, better yet, comment below with your revelations!).

I know many of us struggle with student participation in different forms, whether it’s getting students to complete surveys, vote in student council elections, attend open forums for on-campus faculty and staff candidates, or other times when the student view is imperative. But how do we get them to complete the things or show up at the stuff (outside of paying them or tying participation to things like course registration)? And how do we proceed if they don’t?

At the NEEAN fall forum in November, I attended a presentation on a topic related to student participation in surveys, focus groups, and the like. A woman in the audience had herself been a participant in a longitudinal research project (over 15 years). She offered some advice on how to get and keep students engaged with research and data-collection projects that I will keep with me and share with you:

  • Show the project’s importance in the big picture – Communicate to students how their voice will shape and be an important part of the future of these initiatives for their future peers and colleagues.
  • But also keep it relevant to the present – Share with students how their participation contributes to college initiatives becoming more beneficial to them. Their voice will help make things better/more effective in their time at the college, not just in some nebulous future time.
  • Make it a mutual investment – In the case of a focus group, where you know your participants and they’re sharing much of their time for your project, make the time and effort to remember or attend one of their events. This of course isn’t always applicable (or in cases of confidentiality, appropriate) but if students are giving you their time, give them yours. Send a birthday card, attend their on-campus presentation, go to their orchestra concert, etc. The participant is investing in your project, so invest in theirs.
  • Follow up with the results and check in – Depending on the timeline and scope of your project, (briefly) check in with your student participants on the research’s progress and give them access to the results. Not only does this help with transparency but also keeps students engaged in the process, and, potentially, creates early adopters of the findings.
  • Preach and model ‘pay it forward’ – Whether you’re a student, faculty, or staff member, there will come a time when you will need other people to complete something for you (e.g., a survey, research questionnaire, etc.), so for this and other reasons, we should all probably be thoughtful about doing the same for others. This concept is larger than the bounds of one person’s project, so how do we as a college-wide community communicate this to students?? (Also, there’s got to be a term for this out there already – Data Stewardship? Civic Participation? Academic Responsibility? Survey Karma? – …ideas???)

I’m working on a few of these already, but the “pay it forward in data collection” is a concept I want to keep thinking about. I haven’t hit a millionaire-level idea with it yet but I’ll keep you all updated. You do the same. What have you done to get the student voice?