At the American Copy Editors Society (ACES) conference in October, we learned about research conducted by Fred Vultee, Ph.D., ACES board member and professor of journalism at Wayne State University. Fred studied the effects of standard copyediting practices on readers’ perceptions of story quality. Specifically, he asked readers to read four edited and four unedited news stories and then rate the stories on scales measuring professionalism, grammar, and organization.
Overall, editing had a modest but significant effect on how readers scored the articles’ quality. Differences emerged between cultural subgroups (for example, on certain comparisons, older readers were more critical than younger ones), and between factors including amount of time spent reading news and the source of that news (print vs. online).
Now the study is under consideration for academic publication. I caught up with Fred to learn more about his work and what it means for the editing craft.
What prompted this research?
In 2010, I got an email from ACES president Teresa Schmedding, who was wondering how to approach the idea of measuring editing’s effects. The idea resembled some of the work I had done in my doctoral program, so I started collecting material that reflected today’s editing environment. I decided to examine holistic ways of looking at editing, rather than features you could isolate at the sentence or word level; not just “Is this spelled wrong?” but “Does this sound like police scanner jargon?”
I was interested in addressing the question of “I know we’re investing time in editing, so are we getting something in return for that time?” Whether it’s increased attention, whether it’s increased intent to purchase, whether it’s recognition that we’re adding value to a product, I wanted to help add to the body of evidence supporting that.
What are some of the most important lessons from your work?
I think the most important takeaway is that editing doesn’t make everything better for everybody, but it makes most things a little bit better for somebody. For example, some readers noticed a difference in writing, but that wouldn’t move the needle on whether they’d pay for it. But the one thing that consistently increased readers’ perceptions of value was the feeling that stories were written professionally.
I think a side lesson is that not everyone sees organization the same way. That’s one important thing to remember: we’re not editing for one, stable audience anymore; we’re editing for a moving audience.
What does that mean for us as editors?
A few things. I think we’re going to need to be sensitive to the idea that there are probably two or more best ways to organize some stories. That applies especially to web writing; the ways readers are used to seeing content presented have totally changed. I don’t think it’s that online editors don’t notice or don’t care about print standards; I think they’re tapping into a different part of what readers appreciate.
At the same time, I think the markers of good writing will remain: names are spelled correctly throughout; there aren’t sentence fragments or loose strings of words; very traditional prescriptive things that we would always clean up when editing.
And as things continue to change, I think communicators everywhere need to realize that we’re all in the same boat. We help each other when we realize that skills like online publishing are common to all of us, and everybody’s going to need to practice them sometime.
How can we use findings like yours to show others why our job matters?
One thing I tell my editing students is that you don’t want to have to say to people, “Do this because I say so,” or “Do this because it’s magic.” So I think it’s really important for us as editors, who practice this kind of invisible craft, to have evidence to back up our work. If we can show writers the continuous thread from what they do, to what we do, to what the audience gets—then I think we’re in a better position to say that our work is part of a finished product.
Think of it this way: Before a car goes for a test drive, someone has to make sure the lug nuts are tight. If the wheels fall off, the customer’s going to come back mad at somebody. We’re the people who, whether you see them or not, are making sure customers don’t come back mad.
But what about those areas where editing didn’t seem to have an effect? Can we still claim that it’s worthwhile?
I think it’s also helpful, when we talk about the value that our craft adds, to admit that there are some areas where we don’t know if editing makes a difference—rather than have someone point that out for us. If we’re transparent about what we know works, and what we’re not sure about, it’s easier for us to claim that editing is effective enough, in enough cases, to make it a good investment overall.
What’s next for you?
I’ve gotten some very detailed feedback on these findings, so I’m still incorporating that into a revised version. Next, I’ll be interested in looking at some other things that chip away at the question of what’s important to readers. For example, does the decision to use a stock photo vs. a story-specific photo affect what readers get out of it? I’d also like to study platform differences: do people have lower expectations of what they see on Twitter or Facebook, compared with what they expect to be an official corporate release?
I think we can all learn a lot if we look at the stuff we do by instinct, or because we think we’re supposed to, or because we’re told to pay attention to things like SEO. Are these things really generating value? Are they doing what we expect?
How is this research changing the way ACES thinks about editing?
ACES was originally a bunch of old newspaper copyeditors who wrote headlines and had a great time. But we’re a very different organization now. We’ve got a lot of people to appeal to, and the more we can remind everyone that what serves editors in newsrooms is good for editors everywhere, the more we can underline that diversity. I think we’ll all gain from that.
Click here for a PowerPoint summary of Fred’s research, presented at the ACES national conference in March 2012.