The Role of Grammarly as an Automated Writing Evaluation Tool in EFL Writing Learning Process

Authors

  • Kristian Florensio Wijaya, Cita Hati School, Samarinda, Indonesia

DOI:

https://doi.org/10.24239/dee.v6i2.127

Keywords:

EFL writing, Automated Writing Evaluation tool, Grammarly, library study, thematic analysis

Abstract

This study investigated the role of Grammarly as an automated writing evaluation (AWE) tool in supporting EFL learners’ writing development. A library-based review was conducted, analyzing findings from 30 empirical studies published between 2018 and 2025. A thematic analysis approach was employed to identify common patterns across the selected literature. The review showed that Grammarly primarily contributed to improvements in writing accuracy, linguistic proficiency, and surface-level mechanics. The tool also supported the development of learner autonomy, strategic revision skills, and motivation when integrated into classroom writing activities. However, the reviewed studies indicated that Grammarly remained limited in addressing higher-order writing skills such as organization, coherence, and idea development. These findings highlighted the importance of combining AWE tools with teacher feedback to achieve balanced writing instruction. The study offered insights relevant to international educators seeking evidence-based guidance on adopting AWE tools to enhance EFL writing pedagogy.

References

Abdalkader, S. M. A. (2022). Using Artificial Intelligence to Improve Writing Fluency for the Preparatory Stage Students in Distinguished Governmental Language Schools. Egyptian Journal of Educational Sciences, 2(2), 39–70. https://doi.org/10.21608/ejes.2022.270694

Abd El Rasoul, T. G. A., Aboelwafa, M. A., & Seddeek, A. R. (2023). Exploring the attitude of ESP learners towards using automated writing evaluation to assess their writing. Insights into Language, Culture and Communication Journal, 3(1), 88-105. https://doi.org/10.21622/ilcc.2023.03.1.157

Abdul Rahman, N. A., Zulkornain, L. H., Che Mat, A., & Kustati, M. (2023). Assessing Writing Abilities using AI-Powered Writing Evaluations. Journal of ASIAN Behavioural Studies, 8(24), 1–17. https://doi.org/10.21834/jabs.v8i24.420

Armanda, M. L., Nugraheni, A. F., Wulansari, A., & Imron, A. (2022). “Grammarly” as English Writing Assistant from EFL Students’ Perspective. English Education: Journal of English Teaching and Research, 7(2), 128–137. https://doi.org/10.29407/jetar.v7i2.17988

Ayan, A. D., & Erdemir, N. (2023). EFL Teachers’ Perceptions of Automated Written Corrective Feedback and Grammarly. NEU Journal, 5(3), 1183–1198. https://doi.org/10.38151/akef.2023.106

Barrot, J. S. (2022). Integrating Technology into ESL/EFL Writing through Grammarly. RELC Journal, 53(3), 764–768. https://doi.org/10.1177/0033688220966632

Braun, V., & Clarke, V. (2022). Conceptual and Design Thinking for Thematic Analysis. Qualitative Psychology, 9(1), 3–26.

Dewi, U. (2023). Grammarly as Automated Writing Evaluation: Its Effectiveness from EFL Students’ Perceptions. Lingua Cultura, 16(2), 155–161. https://doi.org/10.21512/lc.v16i2.8315

Dizon, G., & Gayed, J. M. (2021). Examining the Impact of Grammarly on the Quality of Mobile L2 Writing. JALT CALL Journal, 17(2), 74–92. https://doi.org/10.29140/jaltcall.v17n2.336

Dizon, G., & Gold, J. (2023). Exploring the effects of Grammarly on EFL students’ foreign language anxiety and learner autonomy. JALT CALL Journal, 19(3), 299–316. https://doi.org/10.29140/jaltcall.v19n3.1049

Dodigovic, M. (2021). Automated Writing Evaluation: The Accuracy of Grammarly’s Feedback on Form. International Journal of TESOL Studies, 3(2), 71–87. https://doi.org/10.46451/ijts.2021.06.06

Fahmi, M. A., & Cahyono, B. Y. (2021). EFL students’ perception on the use of Grammarly and teacher feedback. Journal of English Educators Society, 6(1), 18–25. https://doi.org/10.21070/jees.v6i1.849

Fan, N. (2023). Exploring the Effects of Automated Written Corrective Feedback on EFL Students’ Writing Quality: A Mixed-Methods Study. SAGE Open, 13(2), 1–17. https://doi.org/10.1177/21582440231181296

Setiawan, F., & Alkhowarizmi, A. (2025). Exploring an Artificial Intelligence as Automated Feedback Program in EFL Writing. English Teaching Journal, 16(1), 202–224. https://doi.org/10.26877/eternal.v16i1.1206

Fitria, T. N. (2021). “Grammarly” as a Teachers’ Alternative in Evaluating Non-EFL Students’ Writings. LEKSEMA: Jurnal Bahasa dan Sastra, 6(2), 141–152. https://doi.org/10.22515/ljbs.v6i2.3957

Geng, J., & Razali, A. B. (2022). Effectiveness of the Automated Writing Evaluation Program on Improving Undergraduates’ Writing Performance. English Language Teaching, 15(7), 49-60. https://doi.org/10.5539/elt.v15n7p49

Ghufron, M. A., & Rosyida, F. (2018). The Role of Grammarly in Assessing English as a Foreign Language Writing. Lingua Cultura, 12(4), 395–403. https://doi.org/10.21512/lc.v12i4.4582

Guo, Q., Feng, R., & Hua, Y. (2022). How effectively can EFL students use automated written corrective feedback in research writing? Computer Assisted Language Learning, 35(9), 2312–2331. https://doi.org/10.1080/09588221.2021.1879161

Hassanzadeh, M., & Fotoohnejad, S. (2021). Implementing an automated feedback program for a foreign language writing course: A learner-centric study. Journal of Computer Assisted Learning, 37(5), 1494–1507. https://doi.org/10.1111/jcal.12587

Huang, H. L., Hwang, G. J., & Chang, C. Y. (2020). Learning to be a writer: A spherical video-based virtual reality approach to supporting descriptive article writing in high school Chinese courses. British Journal of Educational Technology, 51(4), 1386–1405. https://doi.org/10.1111/bjet.12893

Khoshnevisan, B. (2020). The affordances and constraints of automatic writing evaluation tools: A case for Grammarly. ResearchGate, 27(4), 12–25. https://www.researchgate.net/

Klassen, A. C., Creswell, J., Plano Clark, V. L., Smith, K. C., & Meissner, H. I. (2012). Best practices in mixed methods for quality of life research. Quality of Life Research, 21(3), 377–380. https://doi.org/10.1007/s11136-012-0122-x

Link, S., Mehrzad, M., & Rahimi, M. (2022). Impact of automated writing evaluation on teacher feedback, student revision, and writing improvement. Computer Assisted Language Learning, 35(4), 605–634. https://doi.org/10.1080/09588221.2020.1743323

Liu, P., Zhang, Y., & Liu, D. (2022). Flow experience in foreign language writing: Its effect on students’ writing process and writing performance. Frontiers in Psychology, 13(8), 1–14. https://doi.org/10.3389/fpsyg.2022.952044

Miranty, D., Widiati, U., Cahyono, B. Y., & Sharif, T. I. S. T. (2023). Automated writing evaluation tools for Indonesian undergraduate English as a foreign language students’ writing. International Journal of Evaluation and Research in Education, 12(3), 1705–1715. https://doi.org/10.11591/ijere.v12i3.24958

Nathir Ghafar, Z. (2024). Assessment of EFL Teachers’ and Students’ Writing Skills by Using the Grammarly Program. Journal of E-Learning Research, 3(1), 1–16. https://doi.org/10.33422/jelr.v3i1.671

Ngo, T. T. N., Chen, H. H. J., & Lai, K. K. W. (2024). The effectiveness of automated writing evaluation in EFL/ESL writing: A three-level meta-analysis. Interactive Learning Environments, 32(2), 727–744. https://doi.org/10.1080/10494820.2022.2096642

Nova, M. (2018). Utilizing Grammarly in Evaluating Academic Writing: A Narrative Research on EFL Students’ Experience. Premise: Journal of English Education, 7(1), 80–96. https://doi.org/10.24127/pj.v7i1.1300

Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic Analysis: Striving to Meet the Trustworthiness Criteria. International Journal of Qualitative Methods, 16(1), 1–13. https://doi.org/10.1177/1609406917733847

Palermo, C., & Wilson, J. (2020). Implementing automated writing evaluation in different instructional contexts: A mixed-methods study. Journal of Writing Research, 12(1), 63–108. https://doi.org/10.17239/JOWR-2020.12.01.04

Prasetya, R. E., & Raharjo, D. H. (2023). Enhancing English Language Writing Skills: An Evaluation of the Efficacy of Grammarly Application. Journal of English Language Studies, 8(2), 320-338. https://doi.org/10.30870/jels.v8i2.19294

Sanosi, A. B. (2022). The Impact of Automated Written Corrective Feedback on EFL Learners’ Academic Writing Accuracy. Journal of Teaching English for Specific and Academic Purposes, 10(2), 301–317. https://doi.org/10.22190/JTESAP2202301S

Sanosi, A. B., & Mohammed, M. O. M. (2024). The effectiveness of automated writing evaluation: A structural analysis approach. International Journal of Evaluation and Research in Education, 13(2), 1216–1226. https://doi.org/10.11591/ijere.v13i2.25372

Saricaoglu, A., & Bilki, Z. (2021). Voluntary use of automated writing evaluation by content course students. ReCALL, 33(3), 265–277. https://doi.org/10.1017/S0958344021000021

Suteja, S., & Setiawan, D. (2022). Students’ Critical Thinking and Writing Skills in Project-Based Learning. International Journal of Educational Qualitative Quantitative Research, 1(1), 16–22. https://doi.org/10.58418/ijeqqr.v1i1.5

Taj, S., & Khan, M. A. (2024). Comparing Grammarly and ChatGPT for Automated Writing Evaluation of ESL Learners. Bahria University Journal of Humanities and Social Sciences, 7(2), 62–89. https://www.researchgate.net/

Takallou, F. (2025). The Impact of Grammarly AI-powered Writing Tool on the EFL Students’ Writing Proficiency and Autonomy. Technology Assisted Language Education Journal, 3(2), 99–119. https://doi.org/10.22126/tale.2025.11107.1062

Tambunan, A. R. S., Andayani, W., Sari, W. S., & Lubis, F. K. (2022). Investigating EFL students’ linguistic problems using Grammarly as automated writing evaluation feedback. Indonesian Journal of Applied Linguistics, 12(1), 16-27. https://doi.org/10.17509/ijal.v12i1.46428

Thi, N. K., & Nikolov, M. (2022). How Teacher and Grammarly Feedback Complement One Another in Myanmar EFL Students’ Writing. The Asia-Pacific Education Researcher, 31(6), 767–779. https://doi.org/10.1007/s40299-021-00625-2

Tran, T. M. L., & Nguyen, T. T. H. (2021). The Impacts of Technology-based Communication on EFL Students’ Writing. Asia CALL Online Journal, 12(5), 54–76. http://eoi.citefactor.org

Wei, P., Wang, X., & Dong, H. (2023). The impact of automated writing evaluation on second language writing skills of Chinese EFL learners: A randomized controlled trial. Frontiers in Psychology, 14(9), 1–11. https://doi.org/10.3389/fpsyg.2023.1249991

Wijayanti, S., Sumarta, & Rahmawati, M. (2021). Teachers’ perception on the effectiveness of using Grammarly as a tool for writing assessment. LINGUISTIK: Jurnal Bahasa & Sastra, 6(2), 342–355. https://doi.org/10.31604/linguistik.v6i2.342-355

Xu, T. S., Zhang, L. J., & Gaffney, J. S. (2022). Examining the Relative Effects of Task Complexity and Cognitive Demands on Students’ Writing in a Second Language. Studies in Second Language Acquisition, 44(2), 483–506. https://doi.org/10.1017/S0272263121000310

Yamashita, T. (2024). Effectiveness and inclusiveness of locally adapted human-delivered synchronous written corrective feedback for English referential articles. Computer Assisted Language Learning, 37(6), 1074–1107. https://doi.org/10.1080/09588221.2022.2068612

Zhai, N., & Ma, X. (2022). Automated writing evaluation (AWE) feedback: A systematic investigation of college students’ acceptance. Computer Assisted Language Learning, 35(9), 2817–2842. https://doi.org/10.1080/09588221.2021.1897019

Published

2025-12-03

How to Cite

Wijaya, K. F. (2025). The Role of Grammarly as an Automated Writing Evaluation Tool in EFL Writing Learning Process. Datokarama English Education Journal, 6(2), 61–76. https://doi.org/10.24239/dee.v6i2.127

Section

Articles