A Study on the Implementation of Automated Writing Evaluation
DOI: https://doi.org/10.58213/ell.v4i2.54

Keywords: automated writing evaluation, writing instruction, writing assessment

Abstract
The body of research demonstrating the value of automated writing evaluation (AWE) systems in writing instruction continues to expand. However, little research has examined how AWE can be implemented in different instructional settings and what effects it has on students' writing. This article describes the MI Write AWE system and reports the findings of a study, drawing on multiple research methods, that examined the integration and use of AWE in middle school writing instruction. AWE integration was examined in two instructional settings, which were compared with one another: a conventional process approach to writing instruction and strategy instruction based on the self-regulated strategy development model. The study evaluated how effectively each setting supported improvement in the quality of students' writing, from their first drafts through subsequent essays, and examined students' and teachers' experiences with, and perspectives on, teaching and learning with AWE. Following the eight-week intervention, multilevel model analyses showed that students' first-draft writing quality improved over the course of the intervention at approximately the same rate regardless of instructional setting. Qualitative analyses of interview data indicated that AWE's effects on teaching were consistent across settings. Both instructional settings exhibited qualities consistent with a framework of purposeful practice, particularly in how AWE was applied.
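The abstract refers to multilevel model analyses of growth in first-draft writing quality across the eight-week intervention. As a minimal sketch only, the following shows one common way such a growth model is specified; the two-level structure (essay occasions nested within students), the variable names, and the condition-by-time interaction are illustrative assumptions, not the article's actual model specification.

% Illustrative two-level growth model (assumed specification, not the article's):
% Quality_{ti} = first-draft quality of student i's essay at occasion t;
% Condition_i  = dummy for instructional setting (process vs. SRSD-based).
\begin{align*}
\text{Level 1:}\quad & \mathrm{Quality}_{ti} = \pi_{0i} + \pi_{1i}\,\mathrm{Time}_{ti} + e_{ti}\\
\text{Level 2:}\quad & \pi_{0i} = \beta_{00} + \beta_{01}\,\mathrm{Condition}_i + r_{0i}\\
& \pi_{1i} = \beta_{10} + \beta_{11}\,\mathrm{Condition}_i + r_{1i}
\end{align*}
% Under this parameterization, beta_{11} is the difference in growth rates
% between settings; a beta_{11} near zero corresponds to the reported finding
% that first-draft quality improved at similar rates in both settings.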