Improving Student Assessment

Volume 24, No. 3 - Spring 2013
RAISING THE BAR. Lorie Hach and Carla Gerriets participated in an inclusive and collaborative process to improve student writing assessment at Sitting Bull College. Photo by M.K. Laughlin


In the past, tribal college student assessment has told a troubling story. But the problems haven’t necessarily been student performance or faculty aptitude. Rather, the assessment measures themselves—their designs and methodologies—have been the problem. According to a 2008 article in Tribal College Journal by Maggie George (Diné) and Daniel McLaughlin, “Tribal colleges and universities are often faced with financial constraints and pedagogical challenges that adversely affect assessment outcomes, resulting in a misleading portrait of the institution in question.” For example, at Sitting Bull College (SBC, Fort Yates, ND), writing instructors noted that results from the final computer-adaptive college placement test (ACT Compass) showed some improvement in sentence fluency and support, but overall scores were not necessarily higher than when students were admitted. However, as Chad Harrison (Standing Rock Lakota) notes, “Assessment may not always reflect the job we are doing.”

When grounded in tribal college missions, assessment can more accurately reflect whether students are progressing. Aided by a five-year Woksape Oyate: Wisdom of the People grant from the Lilly Endowment, SBC started an Academic Excellence Center.

“There’s a more concentrated effort on referring students to the Academic Excellence Center for tutoring, more tutoring hours, more tutors available,” says writing instructor Renee Froelich. While this tutoring seemed helpful, at the end of the fourth year instructors still had lingering questions: How can we determine if student writing is improving from course to course? Is there a need for an additional course between English foundations and first-year college writing? How can we improve ACT Compass scores? And how can we work with high school teachers to better prepare students for college-level writing?

To answer these questions, a diverse group of educators “put their minds together to see what they could build for their children,” in the words of Tatanka Iyotake, Sitting Bull. Middle- and secondary-school teachers from schools on the Standing Rock Reservation came together with SBC writing instructors and the writing center coordinator, a researcher from North Dakota State University, and an NDSU graduate student. Together, we assessed the writing of all students in writing classes at SBC during the 2011-2012 academic year. Karen (Swisher) Comeau (Standing Rock Dakota) assembled and mentored the group.

For the 2011-2012 academic year, SBC writing instructors agreed to use the same writing prompt for all of their writing classes at the beginning and end of each semester—and to score these essays as a group using a norming process that sought to increase consensus in the scoring of student writing. The group met twice during their free time, on Saturdays, to score pre-essays and then to score post-essays. The group read and discussed the prompt, read and discussed the scoring guide, scored a set of “anchor” essays, and came to consensus on the scores. They then began scoring student essays using the first set as anchor essays to help make scoring decisions.

The average score on the writing assessment was 2.36 out of 6 at the beginning of the semester and 3.26 at the end, demonstrating that student writing improved by almost one point on a six-point scale. We now have clear evidence that student writing does improve from the beginning of the course to the end. The data also showed that, with training, instructors at all levels can improve the accuracy of their scoring. This was especially encouraging for the writing center coordinator, Lorie Hach, who is not an English teacher. At first, she felt intimidated by the prospect of assessing writing. By learning collaboratively how to use the scoring guide, she gained confidence. She felt the approach put everyone “on a level playing field” when it came to assessing student writing. The group’s consensus on scoring the essays increased from 71% to 93% over the semester—an increase of 22 percentage points.

What everyone really valued was the process—and coming to consensus through dialogue. Middle and high school teachers and college instructors took the time to talk through what they valued in student writing and their methods for assessing it, and then came to an agreement about how to assess the essays accurately. If one instructor gave an essay a two and another gave it a four, they talked through their differences and came to a consensus. This inclusive and collaborative process led to productive conversations that helped teachers consider the impact their teaching practices have on student success.

Lorie Hach, for example, worked closely with English foundations instructor Carla Gerriets to provide intensive one-on-one work with students in the writing center. Chad Harrison, who teaches first- and second-year writing courses, encouraged students to discuss topics relevant to their day-to-day lives on the reservation while developing the rhetorical skills they need to write papers with strong arguments. One of the high school teachers, Irma Pokorny, went on to try out some of the strategies in her high school classroom. Meanwhile, SBC instructors experimented with making assessment expectations more accessible to their students while providing more models of successful writing. Giving students opportunities to “read, analyze, and emulate models of good writing” is one of the eleven elements of writing instruction that help improve student writing, according to rigorous research done by Steve Graham and Dolores Perin in their 2007 report, Writing Next: Effective Strategies to Improve Writing of Adolescents in Middle and High Schools. (The report can be found on the Alliance for Excellent Education website.)

Revisiting the questions we had at the beginning, we found that student writing clearly is improving from course to course. We also determined that an additional course between the foundations course and the first-year writing course is unnecessary; in fact, our research suggested it might adversely affect student progress. Instead, students can receive additional one-on-one help through the Academic Excellence Center. With respect to how we might improve ACT Compass scores, we decided that our own internal assessment gave more complete and accurate data than those scores, so the internal assessment should be our focus for program evaluation. We also felt our work with high school teachers would help prepare students for college-level writing. We look forward to developing continued partnerships with them and believe these partnerships will support our students’ success.

Collaborative assessment provides professional development to teachers at all levels, and when norming is used to ensure accuracy of scoring, teachers can feel confident that the student gains are real. The discussions surrounding the assessment are as important as (or more important than) the assessment itself; teachers can share best practices and solve problems in a supportive environment. Also, because all of the teachers work in the local cultural context, the discussions were holistic and consistent with the mission and vision of the tribal college and community. At the end of the second semester, the average essay score was even higher (3.3) than at the end of the first semester. Teachers thought the students showed stronger content, better critical thinking, more focused introductions, and improved paragraph development.

We recommend collaborative assessment to other tribal colleges seeking to improve student writing. Because the work involved is intensive, however, participants need released time to give their full energy to the task. Furthermore, the assessment itself must be constructed by the instructors, integrated into the class so that it is not “just another test” piled on top of students, and fully analyzed so that results can be used to improve teaching practices.

This article was submitted by Kelly Sassi, but Mary Kateri Laughlin, Renee Froelich, Carla Gerriets, Lorie Hach, Chad Harrison, and Karen Comeau were all co-authors. For more information about the project, email Kelly.Sassi@ndsu.edu.

