
Scholar Dan Melzer says college students are skilled at evaluating their own and their peers’ writing, if given the proper guidance.

Dan Melzer, associate director of first-year composition at the University of California, Davis, shook up the research on teaching college-level writing in 2014 with his book Assignments Across the Curriculum: A National Study of Student Writing (Utah State University Press). The decade-long analysis of thousands of college and university writing prompts from different fields revealed many of these assignments were lacking in creativity and scope. Further, in Melzer’s view, they were “obsessed” with grammatical correctness over critical thinking.

Now Melzer is shaking things up again, urging college instructors across disciplines to reconsider how they assess student writing—and who’s assessing it. Melzer’s new book, Reconstructing Response to Student Writing: A National Study From Across the Curriculum (Utah State University Press), argues that students are actually more effective in responding to peers’ writing than are faculty members. That is, students tend to adhere to the best practices of writing response more than professors do, offering praise as well as specific, content-level feedback over generic, sentence-level critiques. Melzer also finds that students are skilled in self-assessment. So what is the role of the faculty member in this landscape? Melzer says students offer their best feedback when they’re guided by professors to do so.

Dan Melzer’s new book, Reconstructing Response to Student Writing: A National Study From Across the Curriculum (Utah State University Press).

Melzer chatted with Inside Higher Ed about his findings and how they ultimately relate to student success. Read on for his insights.

Q: Your new book, Reconstructing Response to Student Writing, is based on a national study of professor, peer and self-assessments of writing at the college and university level. Can you talk a bit about the study design and what you were looking to understand?

A: I analyzed over 1,000 pieces of student writing with teacher or peer response from 70 institutions of higher education across the U.S. I located these responses from electronic portfolios of student writing, so I was also able to collect examples of student self-assessment, in the form of portfolio reflection letters.

To analyze the responses, I created a heuristic that captures the different aspects of responding to student writing that are discussed in the research literature: what should response focus on, when should we respond, what modalities should we use when we respond and so on.

My goal was to get a large-scale view of how college teachers across disciplines respond to student writing, how students respond to each other when they’re asked to engage in peer response and how well students are able to assess their own writing.

Q: What was your big finding?

A: My most important finding is that students actually respond more effectively than teachers. Based on the best practices in response that I summarized in my review of the literature, students were more likely to give a useful response than teachers. For example, student comments were more specific and less generic than teacher comments, and students did less sentence-level editing than teachers and focused more on content. Students also offered more praise comments than teachers.

Related to this finding is that students were also skilled at self-assessment. When teachers asked students to assess their own writing in a “processes memo” to be included with a draft turned in for teacher response, many times the teacher’s comments were just confirming the students’ own self-assessments.

The key to the success of the peer response and the self-assessment in the courses in my study seemed to be that teachers did a good job of scaffolding. Teachers provided thoughtful scripts to guide peer response, and they asked students to engage in peer response with each major writing assignment. Teachers also provided plenty of guidance for student self-assessment and asked students to self-assess frequently—for example, in process memos included with drafts, in midterm reflections and in a final cover letter included with a portfolio.

Q: What are the implications of your findings for students and for professors across disciplines who are teaching them to write?

A: My research adds to prior research that shows teachers could spend less of their labor responding to student writing and more time designing peer response and student self-assessment. I think teacher response is still important, because there were many examples in my study where teachers were providing useful feedback that peers had missed or expert knowledge about the conventions of a disciplinary genre (a lab report, for example). But when teachers do respond, I suggest joining in conversation with peer response and with the students’ own assessment of their writing rather than falling into the trap of just the teacher “correcting” the student.


For example, when you collect drafts from students to give them feedback, ask them to include a processes memo summarizing the feedback they received from peers and their own self-assessment of the strengths and weaknesses of the draft. My research indicates that teachers will likely be pleasantly surprised by the quality of feedback from peers and by students’ own ability to assess their writing.

It’s also worth noting that when students engage in self-assessment, they make greater gains in their writing and are more likely to transfer what they are learning about writing to future writing tasks.

Q: How does this research build on your previous research on teaching writing? Has it made you rethink your past work in any way?

A: In my previous book, Assignments Across the Curriculum, I conducted a national study of writing assignments, and one of my findings was that even though many teachers claim to focus on critical thinking in their writing assignment prompts, in their rubrics they can sometimes be obsessed with grammar and syntax and often grade harshly based on sentence-level errors. What I found in Reconstructing Response was that students have learned from their experiences with college teachers’ writing assessments to be highly focused on grades, and in their process memos and portfolio reflection letters, many students talk about trying hard to please the teacher and trying to play it safe in their writing in order to avoid being marked down for grammatical errors. This has made me want to de-emphasize grading in my own courses and focus more on self-assessment.

I now put more of the weight of a student’s final grade on an extensive self-assessment they do as their culminating piece of writing. They use the evidence of a portfolio of their writing for the semester to argue the extent to which they’ve met the learning outcomes for the course, and they reflect on their own growth and challenges as writers. I would encourage teachers to assign more self-reflective writing, and to do less grading by using alternatives to traditional grading such as contract grading, ungrading or portfolio assessment.

Q: What’s the link between your study and student success? Why does getting writing feedback right ultimately matter?

A: We know from prior research that students value feedback and want to apply it to their future writing in the course and beyond. We also know from prior research that when students are asked to give peers feedback or to assess their own writing, they make greater gains as writers because they become more self-aware. But we also know that teacher feedback students see as harsh, or vague, or rubber-stamped isn’t helpful, and feedback that mostly justifies a final grade is not that useful to students in terms of applying comments to future writing.

If teachers put more of the feedback onus on students through peer response and student self-assessment, students will actually experience faster growth as college writers. And when teachers do give feedback, responding to a draft in progress rather than focusing feedback on a final, graded draft is a much smarter use of time and will have a greater positive impact on student success.

Do you have a unique or timely approach to teaching writing across disciplines at your institution? If so, we’d love to hear about it.