Darsie Bowden’s Investigation into the Role of Instructor Feedback on Students’ Writing Efforts

Students care about feedback and consider comments on their papers much more carefully than instructors often give them credit for.  That’s one of the encouraging findings emerging as Writing, Rhetoric and Discourse Professor Darsie Bowden and a team of WRD student researchers approach the analysis stage of a project examining how students use instructor feedback.

Teachers commonly complain that students ignore their written comments on assignments.  Some have even tried different approaches to commenting, including audio-recording their suggestions, conferencing with students or soliciting feedback, approaches that are probably impractical on a large scale.  Darsie, who has taught writing to teachers and students for 30 years at three institutions and has administered DePaul’s first-year writing program, is approaching this conundrum from the student’s perspective.

“We’re looking at the wrong thing when we look at the final draft,” she said.  “It’s static.”  Why?  It leaves out what she is finding to be a critical aspect: “students’ reflections on how they got there.”

Over the past two years Darsie has been attempting to identify precisely what transpires from the moment students confront teacher feedback through the decision-making process that ends with their submission of a revised draft.  She and her researchers have been meeting one to three times a week in the SSRC’s computer lab since last summer, where they are using NVivo software to structure and analyze data collected from 47 students, primarily freshmen, from 12 sections of WRD 103, Composition and Rhetoric I, a course on college-level writing standards and expectations required of most first-year DePaul students.  The data consists of written drafts, survey information and interviews assembled, transcribed and coded by Darsie, her researchers and staff from The University Center for Writing-based Leadership at DePaul.

The interviews have provoked “wonderful conversations with students,” said Darsie, opening a window onto their thought processes, from students’ initial reactions to teacher feedback through the planning, execution and revision of their papers.  Each project participant gave two interviews.  The first came upon receiving written comments, when students discussed how they felt about the feedback, how they interpreted it and how they planned to address it.  The second came after they submitted a revised paper; that conversation focused on how they’d incorporated their instructor’s comments into both their thinking and their writing, including what influenced their choices and revisions.

Darsie was taken aback to find that 36 of the 47 (77%) subjects in her study expressed confusion over at least one comment from the instructor.  “That’s distressing!” she said.  However, a fuller depiction emerged as students talked through their processes, sharing with interviewers such moments as when a puzzling comment suddenly gelled, be it an idea or the reason behind a grammar rule.

That point might occur far down the road, a finding that could have significant implications for how teachers define writing.  “We may have to conceive of writing in a broader sense,” said Darsie, as a process that unfolds over time, even over years.  “How are you going to open those doors such that instructors can understand what’s going on and intervene in productive ways that will actually help students become better writers and thinkers?” she asked.

Through the analysis, Darsie hopes to pinpoint what sorts of comments help students most and why they follow, ignore or reject suggestions.  Previous investigations of this type have concentrated on student writers at specific types of schools (such as Ivy League or two-year institutions) or on best-practice recommendations for teacher feedback.  She wants to scrutinize the thought and writing processes of a broad range of students, from strong writers to unconfident ones.  She thinks her sample “pretty nicely” represents the demographics of first-year DePaul students across colleges, including non-native speakers, students who struggle with learning disabilities, transfer students and high achievers.

Her project has received financial support from DePaul’s Quality of Instruction Council, the University Research Council and LAS Summer Research Grants, as well as a prestigious Research Initiative award from the Conference on College Composition and Communication.  She greatly appreciates her research team’s suggestions on how to use NVivo; the team includes WRD graduate students Bridget Wagner, Meaghan Young-Stephens and Katie Martin, along with former student Jeff Melichar.  NVivo’s search, query and visualization functions let them match and apply coded data from the writing drafts, interviews and demographic surveys (which include GPA and ACT/SAT scores as well as the students’ own assessments of their writing ability) to form a sophisticated analysis and a rich picture of project participants.
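
For readers unfamiliar with qualitative coding, the basic move is to tag passages of transcript with researcher-assigned labels and then query those labels against each participant’s survey attributes, much as a matrix-style query does in qualitative-analysis software.  The short Python sketch below is a hypothetical illustration of that idea only; it is not NVivo’s interface, and the codes, participant IDs and sample data are invented for illustration.

    # Hypothetical sketch only: not NVivo's API. Codes, IDs and data are invented.
    from collections import defaultdict

    # Each excerpt pairs a participant ID with a researcher-assigned code
    # and the quoted passage from that participant's interview transcript.
    excerpts = [
        {"participant": "P01", "code": "confusion", "text": "I wasn't sure what 'tighten this' meant."},
        {"participant": "P01", "code": "revision_plan", "text": "I decided to cut the second paragraph."},
        {"participant": "P02", "code": "confusion", "text": "The comma comment didn't click until later."},
    ]

    # Survey attributes gathered alongside the drafts and interviews.
    surveys = {
        "P01": {"self_assessment": "confident"},
        "P02": {"self_assessment": "unconfident"},
    }

    # Tally how often each code appears, broken down by self-assessed writing ability.
    matrix = defaultdict(lambda: defaultdict(int))
    for e in excerpts:
        ability = surveys[e["participant"]]["self_assessment"]
        matrix[e["code"]][ability] += 1

    for code, counts in matrix.items():
        print(code, dict(counts))

A cross-tabulation like this is only a rough stand-in for the kind of query the paragraph above describes; the team’s actual analysis, of course, works over far richer coded data from all 47 participants.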

Darsie plans to summarize her findings in a journal article by the end of summer.  Longer term, she is contemplating a book-length assessment that would expand upon how students handle feedback and what determines strong or weak processing of suggestions.  Her ultimate aim is to provide evidence-based guidance for instructors across disciplines on how to best serve all student writers.
