What's the evidence? Do grammar checkers improve writing?

October 2023

Earnsy Liu looks for readily available evidence to help with the communication decisions we make every day. The evidence excludes anything paywalled, so while it may not reflect all the current research, it encourages us to keep thinking and putting our readers first.

Do grammar checkers improve writing?

Do grammar checkers help you write better over time, or are they one-off quick fixes? After poring over a heap of non-paywalled journal articles, I reckon grammar checkers have their place, but they seem to make an alarming number of mistakes.

This article covers English grammar checkers, with ‘grammar checkers’ referring to programs that give feedback on writing. They’re sometimes called computer-assisted language learning programs, or described as giving automated corrective feedback or automatic writing evaluation. The quality of the feedback, and how it’s given, matters: if the feedback isn’t good, how can it improve anyone’s writing?

Examples of grammar checkers are Criterion, Grammarly, Pigai, Microsoft Word, and Virtual Writing Tutor. One literature review covered a whopping 43 programs (Shadiev & Feng, 2023)! The focus of grammar checkers varies greatly. For example, some programs:

  • rate specific aspects of your writing
  • give more general feedback than others
  • summarise your errors and strengths
  • suggest further improvement
  • allow teachers and peers to provide feedback too.


Be aware of limits in the research

All the research I found was conducted in academic settings, often on participants for whom English wasn’t a first language. There didn’t seem to be any research in business settings or on native English speakers. Some findings will apply to everyone though, so keep reading, whoever you are!

Another thing: I can’t be sure all the findings still apply, because software is updated every five minutes these days. Accuracy, user-friendliness, or both may have improved since the studies were published, so I’ve opted not to name the programs. But at least you’ll have some idea of what to look out for.

Grammar checkers do offer benefits

Two researchers who reviewed 36 studies published between 1990 and 2011 concluded that texts produced with grammar checkers were better than texts produced without them. They found little evidence, however, of more general improvement in writing (Stevenson & Phakiti, cited in Shadiev & Feng, 2023).

But survey responses in other studies suggest that grammar checkers might help users better understand grammar rules (Cavaleri & Dianati, 2016; O’Neill & Russell, 2019a; O’Neill & Russell, 2019b). Respondents rated how much they agreed with statements like:

  • The grammar feedback developed my language long term (not just for this assignment) as I could understand the grammatical rules more.
  • I believe the grammar feedback will develop [students’] confidence in their language long term (not just for this assignment) as they could better understand the grammatical rules.
  • Grammarly has helped me understand grammar rules.

Does better understanding translate to better writing? I’d love to know.

They give immediate, personalised feedback

‘Typically, grammar checkers work by scanning through a text and providing immediate feedback … If the checker finds an error, it will explain the grammar rule and may also offer a solution which the user can accept or ignore … grammar checkers do not claim to teach grammar; they are a tool to bring potential problems to the writer’s attention.’ (Cavaleri & Dianati, 2016, p.A-225).
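To make that scan-and-suggest loop concrete, here’s a minimal, hypothetical sketch in Python. It is not how any named product actually works: the rules, wording, and accept-or-ignore prompt are all invented for illustration.

```python
import re

# Hypothetical rules for illustration only: each rule pairs a pattern
# with an explanation of the grammar rule and a suggested fix.
RULES = [
    (re.compile(r"\bcould of\b"), "'could of' is a mishearing of 'could have'", "could have"),
    (re.compile(r"\bseeked\b"), "'seek' has the irregular past tense 'sought'", "sought"),
]

def check(text):
    """Scan the text and flag potential problems, as the quote describes."""
    for pattern, explanation, fix in RULES:
        for match in pattern.finditer(text):
            # The writer decides whether to accept or ignore each suggestion.
            print(f"Possible error {match.group()!r} at position {match.start()}: "
                  f"{explanation}. Suggested fix: {fix!r} (accept or ignore)")

check("She seeked advice, though she could of asked earlier.")
```

Real products use far larger rule sets, statistical models, or both, but the flag-explain-suggest pattern is the same.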

Prompt feedback matters. Users are more likely to act when something is fresh in their minds (McGregor, Merchant, & Butler, cited in O’Neill & Russell, 2019b). Academic learning advisers who used Grammarly to give feedback felt students were more likely to make corrections (O’Neill & Russell, 2019a), which other research also observed (Potter & Fuller, cited in O’Neill & Russell, 2019a). Learners often ignore feedback from teachers, partly because that feedback is given much later (Guénette, cited in John & Woll, 2020).

Prompt feedback is also likely to improve learners’ grammar significantly more than delayed feedback (Shintani & Aubrey, cited in Shadiev & Feng, 2023).

They may increase engagement and have other benefits

Grammar checkers also:

  • have been found to increase high school students’ motivation, engagement, and confidence in grammar rules and language ability (Potter & Fuller, cited in Cavaleri & Dianati, 2016)
  • offer autonomy, because users can use them anytime and anywhere (Chen & Cheng, cited in Fan, 2023; O’Neill & Russell, 2019a)
  • enable multiple corrections (Warschauer & Ware, cited in Fan, 2023)
  • save instructors time, freeing them up for other things (John & Woll, 2020).

Inaccuracy is a biggie

Accuracy in grammar checkers is about:

  • how many errors they detect
  • whether those errors are actually errors.
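
In evaluation terms, those two aspects are recall (the share of genuine errors a checker finds) and precision (the share of its flags that point at genuine errors). Here’s a minimal sketch of how such figures are computed; the counts are made up, chosen only so the precision matches the 80% best result reported below.

```python
# Made-up counts for illustration: 100 genuine errors in a text,
# 50 flags raised by the checker, 40 of which point at genuine errors.
def accuracy_measures(true_errors, flags, correct_flags):
    recall = correct_flags / true_errors   # share of genuine errors detected
    precision = correct_flags / flags      # share of flags that are genuine errors
    false_alarms = 1 - precision           # share of flags that are false alarms
    return recall, precision, false_alarms

recall, precision, false_alarms = accuracy_measures(100, 50, 40)
print(f"recall {recall:.0%}, precision {precision:.0%}, false alarms {false_alarms:.0%}")
# recall 40%, precision 80%, false alarms 20%
```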

Comparing the accuracy of grammar checkers is tricky. One literature review found that only 11 out of 82 studies reported on accuracy, and they didn’t always calculate it the same way (Shadiev & Feng, 2023). But here’s what we do know.

They miss a lot

Missed errors go uncorrected, and users will think their writing is better than it is.

Les Perelman, an authority on writing evaluation, put 350 words through seven grammar checkers. He had identified 12 major errors in the text, errors that would jump out to highly skilled speakers of English. But the programs struggled to identify all of them. One found only one error, while two found five errors — the most that a program managed to find (Perelman, 2016).

In one study, learning advisers at an Australian university felt ‘the main issue was that [the program] “missed a lot”’ (O’Neill & Russell, 2019a, p.101). Another study reported that while students were ‘largely positive’ about a certain grammar checker, they had concerns, especially about accuracy (O’Neill & Russell, 2019b). They also had trouble understanding some suggestions, with one person reporting that they ‘hated’ the feedback (O’Neill & Russell, 2019b, p.50).

Researchers found ‘pervasively low’ accuracy when they reviewed five articles published between 2007 and 2009. In the best result, only 80% of the flagged errors were genuine errors, which meant at least 20% of the flags were false alarms (John & Woll, 2020, p.171).

The Canadian researchers then had three grammar checkers analyse essays written by advanced learners of English. Alarmingly, all three programs missed errors — lots of them (John & Woll, 2020).

Figure 1: Three grammar checkers assessed essays by advanced learners of English. The first missed 30% of the errors, the second 20%, and the third 5%, so the programs can miss a good chunk of errors. They also varied in which types of error they detected: no single program was good at finding every type (adapted from John & Woll, 2020).

They don’t always suggest the right corrections

Wrong suggestions and false alarms could seriously confuse second language users. If you’ve grown up with a language, your gut may tell you that a grammar checker is wrong. But second language users are ‘more or less at the mercy of the software. Arguably, they are thus more susceptible to being misled, which may result in confusion, frustration and, paradoxically, more error’ (John & Woll, 2020, p.171).

Let’s return to the research above about grammar checkers missing mistakes. The programs sometimes gave the wrong advice for the errors they did find. For example, one knew that ‘seeked’ was wrong, but suggested ‘seeded’ and ‘sleeked’ instead of ‘sought’. Another program realised ‘person’ in a particular context should have been plural but offered ‘personalities’ instead of ‘people’ (John & Woll, 2020).

Figure 2: The grammar checkers sometimes proposed the wrong corrections in the student essays. Five percent of the first program’s suggestions were wrong, 13% of the second’s, and 28% of the third’s. Getting 28% wrong means nearly three out of every ten suggestions were wrong (adapted from John & Woll, 2020).

They can give too much feedback

Grammar checkers can give too much feedback, which can overwhelm, discourage, and hamper learning (Hewett and Sommers, cited in Dembsey, 2017; O’Neill & Russell, 2019b).

One study found that a program tended to give far more feedback than people did (Dembsey, 2017). Across three essays, the program made a total of 118 comments, compared with an average of 51 comments from advisers. (Two advisers did make more comments than the program on one of the essays, though.) The advisers made fewer comments partly because they were pressed for time, but also because they consciously limited feedback and repetition.

Feedback can be complex (Aluthman, cited in Fan, 2023). It can involve technical terms, like impersonal pronoun, run-on sentence, squinting modifier, comma splice, compound predicate, and prepositional phrase (Dembsey, 2017). Sheesh. How many fluent speakers understand what those things mean? Can you imagine how someone who’s not confident in English would feel?

If possible, don’t rely on grammar checkers alone

Having someone to discuss feedback with seems really helpful. In one study (O’Neill & Russell, 2019a), learning advisers:

  • cleaned up the program’s feedback by removing inaccurate or unnecessary suggestions
  • filled in gaps that the program missed or hadn’t addressed well.

Many advisers described the program as ‘a “starting point” or “platform” from which to begin a consultation. They did not see it as a definitive solution to students’ grammatical needs.’ This combined approach gave students the best of the human and AI worlds (O’Neill & Russell, 2019a).

References

Cavaleri, M., & Dianati, S. (2016). You want me to check your grammar again? The usefulness of an online grammar checker as perceived by students. Journal of Academic Language and Learning, 10(1), A223-A236. Retrieved from https://journal.aall.org.au/index.php/jall/article/view/393/246

Dembsey, J. M. (2017). Closing the Grammarly® gaps: A study of claims and feedback from an online grammar program. The Writing Center Journal, 36(1), 63-100. Retrieved from https://www.onlinewritingcenters.org/wp-content/uploads/2019/01/2017-Dembsey-WCJ-36.1.pdf

Fan, N. (2023). Exploring the effects of automated written corrective feedback on EFL students’ writing quality: A mixed-methods study. SAGE Open, 13(2), 1-17. Retrieved from https://journals.sagepub.com/doi/pdf/10.1177/21582440231181296

John, P., & Woll, N. (2020). Using grammar checkers in an ESL context: An investigation of automatic corrective feedback. CALICO Journal, 37(2), 169-192. Retrieved from https://www.researchgate.net/profile/Paul-John-9/publication/314208464_Using_grammar_checkers_to_provide_written_corrective_feedback/links/61814eaf3c987366c319f6ce/Using-grammar-checkers-to-provide-written-corrective-feedback.pdf

O’Neill, R., & Russell, A. M. T. (2019a). Grammarly: Help or hindrance? Academic learning advisors’ perceptions of an online grammar checker. Journal of Academic Language and Learning, 13(1), A88-A107. Retrieved from https://journal.aall.org.au/index.php/jall/article/view/591

O’Neill, R., & Russell, A. M. T. (2019b). Stop! Grammar time: University students’ perceptions of the automated feedback program Grammarly. Australasian Journal of Educational Technology, 35(1), 42-56. Retrieved from https://ajet.org.au/index.php/AJET/article/download/3795/1514/

Perelman, L. (2016). Grammar checkers do not work. WLN: A Journal of Writing Center Scholarship, 40(7-8), 11-20.

Shadiev, R., & Feng, Y. (2023). Using automated corrective feedback tools in language learning: A review study. Interactive Learning Environments, 1-29. Retrieved from https://www.researchgate.net/profile/Yingying-Feng-4/publication/366853238_Using_automated_corrective_feedback_tools_in_language_learning_a_review_study/links/63bfcf2da99551743e609557/Using-automated-corrective-feedback-tools-in-language-learning-a-review-study.pdf