Karmen Pižorn and Melita Lemut Bajec

Systematic review of digital writing assistants in EFL writing instruction

Abstract: Digital writing assistants that deliver Automated Written Corrective Feedback (AWCF) have emerged as valuable tools for reducing teachers’ workload by lessening the time and effort required to provide feedback. When implemented effectively, these tools can enhance students’ writing quality, foster a more holistic understanding of the writing process, and enable learners to revise and correct their work independently. This systematic literature review, conducted in accordance with the PRISMA 2020 protocol, synthesises research on the role of digital writing assistants in providing AWCF. It reviews 22 peer-reviewed English-language papers published between 2018 and 2023, which examine these tools in the context of developing EFL students’ writing skills, in comparison with traditional teacher-generated feedback or in hybrid approaches combining both. The participants in the studies were undergraduates majoring in EFL, along with graduates from other disciplines. The writing tasks focused mainly on essay writing but also included paragraph and research paper writing. The findings highlight that students particularly value the immediacy and accuracy of feedback, which typically addresses areas such as grammar, sentence structure, idea development, word choice, style, cohesion, and coherence. Users receive error corrections and suggested improvements through various methods, including markers, highlighting, meta-linguistic comments, direct suggestions, and interactive lessons. Although occasional false positives occur, the tools are generally perceived as effective in improving students’ writing performance. Importantly, the results support hybrid approaches and underscore the teacher’s vital role in mediating the process. In sum, AWCF writing assistants should be systematically integrated into classroom instruction as supplementary tools, aligned with learning outcomes and supported by clear guidance on interpreting and applying feedback.

Keywords: digital writing assistants, automated written corrective feedback, English as a second/foreign language, writing skills, systematic literature review

UDC: 371
https://doi.org/10.63384/sptB5_z796a
Scientific article

Karmen Pižorn, PhD, full professor, University of Ljubljana, Faculty of Education, Kardeljeva ploščad 16, SI-1000 Ljubljana, Slovenia; e-mail: karmen.pizorn@pef.uni-lj.si
Melita Lemut Bajec, PhD, assistant professor, University of Ljubljana, Faculty of Education, Kardeljeva ploščad 16, SI-1000 Ljubljana, Slovenia; e-mail: melita.lemutbajec@pef.uni-lj.si

Sodobna pedagogika/Journal of Contemporary Educational Studies, Vol. 76 (142), Issue 3/2025, pp. 141–161

Introduction

Assessment is a vital component of the educational process as it significantly enhances the quality of teaching and learning by providing essential information that enables educators and learners to make informed decisions about subsequent teaching and learning steps (Nitko and Brookhart 2018; Readman and Allen 2013). Effective assessment practices help highlight learners’ strengths, identify areas for improvement, guide curriculum development, and improve educational outcomes (Ding 2024).
Well-structured rubrics and grading systems play a crucial role in ensuring fairness and accuracy, and clear criteria help students better understand expectations, support self-regulated learning, and strengthen the credibility and transparency of the evaluation process (Reddy and Andrade 2010). With advancements in pedagogical theories, assessment practices have shifted towards student-centred approaches and active learning strategies, aligning with the constructivist view that knowledge is constructed through interaction (Clark 2012). Consequently, standardised, one-size-fits-all methods are increasingly replaced by more personalised, interactive approaches tailored to individual learners’ needs (Shute and Rahimi 2017; Stiggins 2002). This paradigm allows students to set goals, monitor progress, deepen their understanding, and cultivate lifelong learning skills (Ding 2024).

Assessments take many forms: formal and informal, formative and summative, performance-based, diagnostic, placement, proficiency, and aptitude assessments (Winna and Sabarun 2023). This review focuses on formative assessments, which are ongoing evaluations designed to provide feedback that helps instructors and learners make instructional adjustments to achieve learning objectives (Bitchener and Ferris 2012; Black and Wiliam 1998, 2009). Within this framework, written corrective feedback (WCF) plays a central role. In second and foreign language (S/FL) learning, WCF refers to teachers’ responses to learners’ errors, traditionally referred to in pedagogy as error correction (Li 2018). The perception, significance, and relevance of error correction have evolved over time: while behaviourist perspectives in the 1950s–60s treated errors as signs of inadequate knowledge, by the late 1980s, errors were increasingly recognised, through the lens of language acquisition theory, as an essential part of the learning process (Nagode et al. 2014).

Several typologies of WCF have been proposed. Hendrickson (1978), a pioneer in this field, raised critical questions about the necessity of correcting learners’ errors, the types of errors to focus on, the timing and methods of correction, and the person responsible for correction. Lyster and Ranta (1997) proposed an error-treatment sequence that encourages learner engagement. This sequence includes recasts (e.g. reformulating a verb ending to indicate the correct tense), repetition (highlighting errors through repetition), clarification requests, elicitation (guiding questions for self-correction), explicit correction (identifying the error and providing the correct form), and metalinguistic clues (offering comments on language structure and usage).

Ellis (2009) further advanced WCF research by introducing a two-dimensional model: (1) strategies for providing corrective feedback (CF) and (2) student responses. The first dimension identifies six strategies: direct CF (teacher correction), indirect CF (indicating the presence and location of errors), metalinguistic CF (error codes or brief grammatical descriptions), focused versus unfocused CF (targeting specific errors versus a broad range), electronic feedback, and reformulation (rewriting the text while preserving meaning). The second dimension concerns learner responses, which may involve revisions or, conversely, no revisions if students only review or receive corrections passively.
Adding further complexity, Pižorn (2014) proposed a comprehensive 12-part model of WCF, comprising contextual factors such as the educational context, the teacher’s knowledge of L2 learning theories and approach to teaching, learners’ L2 proficiency and age, considerations related to the zone of proximal development, individualised treatment of learners, and the teacher’s attitude to errors; pedagogical choices such as direct vs. indirect CF, focused vs. unfocused CF, and inclusion vs. exclusion of metalinguistic explanations; and additional dimensions such as self-feedback and peer feedback. This holistic approach frames WCF as part of a process-oriented model of writing development in a foreign language.

Such typologies and recommendations for study designs have contributed to the increasing number of methodologically rigorous studies investigating WCF’s potential benefits for improving the writing abilities of S/FL learners. Although some scholars remain sceptical, most notably Truscott (1996, 1999, 2004, 2007, 2010) and Truscott and Hsu (2008), arguing that WCF contributes minimally or not at all to learning and the development of writing accuracy, recent research (Bitchener and Ferris 2012; Bitchener 2008, 2012a, 2012b; Bitchener and Knoch 2008, 2009a, 2009b, 2010; Ferris 1999, 2003, 2004; Hattie and Timperley 2007) indicates that WCF, overall, helps learners not only in reducing errors during the manuscript revision process but also in improving their long-term writing abilities.

Robust research on the effectiveness of WCF is indeed difficult to design because of the numerous variables involved. These variables range from feedback-related factors (e.g., type, timing, explicitness, intensity, complexity, mode, and language) to individual learner characteristics (e.g., motivation, learning styles, goals, and developmental needs). They also include situational variables (e.g., teacher influence, the context, and socioeconomic conditions) and methodological variables (e.g., instructional design, teaching methods and content), all of which further complicate research (Evans et al. 2010; Nguyen and Renandya 2023; Wang 2023).

As WCF increasingly transitions into digital environments, it is necessary to examine this shift to help users maximise its potential for improving learners’ writing skills. Indeed, digital tools such as Grammarly, Turnitin, and emerging AI-based writing platforms are now commonly integrated into classrooms worldwide, underscoring the growing relevance of automated feedback in real instructional contexts. This systematic literature review, therefore, provides an overview of research regarding the role and effects of automated written corrective feedback (AWCF) on students’ S/FL writing skills. Specifically, it explores the types of AWCF offered by digital writing assistants in S/FL instruction and compares them with traditional teacher-generated feedback and hybrid models that combine teacher guidance with AWCF.

The novelty of this research lies in the nature of systematic reviews themselves: through a structured and comprehensive synthesis of existing studies, systematic reviews offer a high-level understanding of a specific topic. Rather than generating new data, they rigorously collect, evaluate, and integrate findings from various studies to provide a clear and reliable overview.
This method helps address inconsistencies across studies, assess the overall quality of the available evidence, and identify gaps in the research that require further exploration. In this way, systematic reviews contribute to academic discourse and evidence-based decision-making in policy and practice (Ferbežar and Štemberger 2023; Turk 2021). Accordingly, this review aims to contribute to a clearer understanding of how AWCF, particularly when combined with teacher mediation, can enhance S/FL writing skills. To this end, we address the following research questions:

– RQ1: What types of digital writing assistants exist, what kinds of feedback do they provide, and how are these represented?
– RQ2: How does AWCF compare with traditional teacher-generated feedback or hybrid approaches combining teacher input and AWCF?
– RQ3: What are learners’ attitudes towards AWCF?

Methodology

A systematic literature review was conducted to meet the objectives of this study and provide a comprehensive overview of AWCF digital writing assistants and their features. The review followed the Preferred Reporting Items for Systematic Reviews (PRISMA) 2020 guidelines (Page et al. 2021), which ensure transparency, applicability, and scientific rigour. A protocol was established to outline the research questions, information sources, search strategy, selection criteria, data extraction, and subsequent analysis.

Information sources and search strategy

The literature search was conducted using the University of Ljubljana’s Digital Library (DiKUL) portal, which serves as a central index of bibliographic data. It includes both subscription-based and open-access resources from major international publishers and other providers of scientific information across disciplines (DiKUL n.d.). At the time of the search, the DiKUL index covered the databases listed in Appendix A. The Basic Search option of the DiKUL aggregator (based on ProQuest technology) was used to identify potentially relevant documents. This aggregator scans all metadata fields (e.g. title, subtitle, alternate title, author, publication title, issue, abstract, subject terms) and excludes non-contributory fields (e.g. LogoURL, Copyright). The final search was completed in April 2023.

Inclusion and exclusion criteria

The step 1 review criteria were as follows:
– Papers must be peer-reviewed journal articles or literature reviews (excluding grey literature).
– The focus must be on digitally enhanced formative assessment or digital feedback aimed at improving writing skills.
– Papers must fall within the disciplines of Education and Languages/Literature.
– Papers must be published between 2018 and 2023 and be fully accessible through the DiKUL system.
– Papers must address either holistic improvement in writing or specific components (e.g. vocabulary, collocations).
– Papers must be written in English and focus on English as a second/foreign language (S/FL).

Data collection and analysis

Following group discussion and refinement, the final search query was defined as: writing AND feedback AND corrective AND online AND software. This search yielded 600 results, which were exported to Microsoft Excel for screening and removal of inadequate records. Nine duplicates were removed immediately, leaving 591 records for initial screening.

In the first round, titles were reviewed independently by two researchers, who coded each record as 0 (probably not eligible), 1 (probably eligible), or 2 (unclear).
Records coded 1 or 2 by either reviewer progressed to the next stage; a total of 156 records moved forward. In the second round, abstracts were screened by one researcher to optimise time and resources, with each record coded as 0 (not eligible) or 1 (eligible). This yielded 71 papers for full-text review. Full-text screening excluded 49 papers for the following reasons: 36 did not focus on S/FL writing, 9 focused on unrelated forms of feedback, and 4 did not meet the methodological scope. In summary, 22 papers qualified for inclusion in the systematic literature review. The selection process is illustrated in the PRISMA flow diagram (Figure 1).
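The two screening rounds described above can be summarised in a few lines of code. The sketch below is illustrative only and was not part of the original review procedure; the spreadsheet file name, column names, and coding values are assumptions chosen to mirror the reported workflow (deduplication, two title-screening codes per record, one abstract-screening code).

```python
# Illustrative sketch of the record-screening workflow described above.
# Assumptions: records exported from DiKUL to a spreadsheet with hypothetical
# columns "title", "reviewer1" and "reviewer2" (title-screening codes 0/1/2),
# and "abstract_code" (0/1).
import pandas as pd

records = pd.read_excel("dikul_export.xlsx")        # 600 raw records
records = records.drop_duplicates(subset="title")   # remove duplicates (9 in the review)

# Round 1: a record progresses if either reviewer coded it 1 (probably eligible)
# or 2 (unclear).
round1 = records[(records["reviewer1"].isin([1, 2])) |
                 (records["reviewer2"].isin([1, 2]))]

# Round 2: abstracts screened by one researcher, coded 0 (not eligible) or 1 (eligible).
round2 = round1[round1["abstract_code"] == 1]

print(len(records), "records after deduplication")
print(len(round1), "records after title screening")
print(len(round2), "papers for full-text review")
```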
Finally, the GRADE approach was applied to assess the certainty of evidence across the 22 studies and the strength of the resulting conclusions by systematically evaluating five domains: study design, inconsistency, indirectness, imprecision, and publication bias. This approach enhances the reliability of recommendations by ensuring a transparent and robust synthesis of findings (Kirmayr et al. 2021; Prasad 2024).

However, applying GRADE in social sciences presents challenges: randomised controlled trials are rare in education, studies often rely on small or context-specific samples, and subjective experience and cultural variation play significant roles. These limitations were also present in our sample, which primarily comprised smaller or context-bound studies, frequently based on self-reported data and generally lacking randomisation. However, the consistency of results across studies was strong. Given this alignment with existing literature on automated feedback, we considered our evidence base sufficiently valid and robust to support our research aims.

Figure 1: PRISMA flowchart for the literature review

A complete list of the included studies (n = 22) is presented in Table 1.

1. Grami (2020) – AccurIT
2. Chen and Pan (2022) – Aim Writing
3. Wang (2020) – Pigai, iWrite, Awrite
4. Shang (2019) – Cool Sentence Corrective Network
5. O’Neill and Russell (2019) – Grammarly
6. Guo et al. (2021) – Grammarly
7. Yousofi (2022) – Grammarly
8. Cheng (2019) – Automated tracking system
9. Cheng et al. (2022) – Automated tracking system
10. Sherafati et al. (2020) – Writing Planet
11. Sherafati and Largani (2022) – Writing Planet
12. Hassanzadeh and Fotoohnejad (2021) – Criterion
13. Qian et al. (2020) – iWrite
14. Xu and Zhang (2021) – Pigai
15. Mohsen and Abdulaziz (2019) – MY Access
16. Reynolds et al. (2021) – PaperRater
17. Ko (2022) – OWS
18. Oflaz et al. (2022) – Write and Improve
19. Waer (2023) – Write and Improve
20. Dodigovic and Tovmasyan (2021) – Grammarly Premium
21. Thi et al. (2022) – Grammarly Premium
22. Han and Sari (2022) – Criterion

Table 1: AWCF digital writing assistants presented in the papers

Results

Overview of the analysed studies

The corpus analysed consists mainly of quasi-experimental studies, complemented by evaluation studies, case studies, and sequential explanatory studies. These studies typically employed mixed-methods research designs, integrating qualitative and quantitative approaches through multiple data collection techniques, including various tests and scoring scales, reflective journals, questionnaires, focus groups, (semi-structured) interviews, observation protocols, and checklists. This methodological diversity aimed to provide deeper insights into digital writing assistants and the feedback they provide, while also informing their future use.

The number of participants across studies ranged from 6 to 322, with most being university students (aged 18 or older) from a variety of disciplines. Two studies also included postgraduate students. In all cases, English was studied as an S/FL, with participants generally at intermediate or higher proficiency levels. They were enrolled in courses focusing on writing skills development through digital writing assistants, working on tasks such as essays, paragraphs, and research papers. The studies were conducted in multiple international contexts: five in China, three each in Iran and Taiwan, two each in Turkey and Saudi Arabia, and one each in Egypt, Armenia, Hungary, and Afghanistan.

In total, 14 different digital writing assistants were examined. These can be categorised into three groups: (1) tools developed by educational institutions (n = 3), (2) free web-based systems (n = 4), and (3) commercial subscription-based tools (n = 6). One tool provided both a free basic version and a premium version.

The 22 studies may be grouped according to their research focus:
– 15 studies compared AWCF with traditional teacher-generated feedback. Of these, nine employed a design with experimental and control groups, while six exposed the same group to both feedback modes (AWCF and teacher feedback).
– 3 studies examined AWCF independently, focusing on variables such as proficiency level, accuracy, effectiveness, and learner attitudes.
– 4 studies investigated AWCF in combination with teacher guidance (hybrid mode) compared to traditional teacher feedback.

AWCF digital writing assistants

This section presents the 14 AWCF digital writing assistants, with a focus on the type and presentation of feedback they provide, based on descriptions in the analysed studies (Table 2).

Aim Writing (Microsoft Research Asia, China): This is a free online writing assistant providing instant scores and comments on vocabulary (accuracy, variety, and complexity), sentence patterns (clarity, complexity, and fluency), and discourse structures (coherence). It supports exams such as CET-4, CET-6, TOEFL, and IELTS. Using machine learning and deep learning, it diagnoses writing issues, offers corrections with native-like expressions and error samples, and integrates with WeChat.

Awrite: This commercial digital writing assistant utilises natural language processing techniques to evaluate grammar, style, mechanics, lexical complexity, topic-specific vocabulary, and organisation. It provides holistic and individualised feedback, using trait scores and metalinguistic references.

AccurIT (King Abdulaziz University and Umm Qura University): This is an institutional tool designed to improve English collocations and phrase use. It prevents literal translation errors through examples from concordances and databases, providing feedback on collocations, idioms, word choice, grammar, punctuation, spelling, and composition.

Automated Tracking System: This institutional web-based tracking system is designed to assist students in enhancing their writing skills by enabling them to upload their essay drafts and revised versions.
The system identifies the type of feedback provided by teachers and the revisions made by students, tracking whether these revisions correspond to the feedback received. Its goal is to illustrate the differences between the feedback given and the changes made, thereby promoting a better understanding and improvement of users’ writing abilities.

Criterion®: This is a commercial web-based writing assistant that offers personalised diagnostic feedback on both form (grammar, language use, mechanics, style, and organisation) and content (issues related to meaning). It provides instant grading on a scale from one to six. This tool aids learners in planning, writing, and revising their essays by highlighting errors and offering indirect feedback, which includes brief meta-linguistic explanations, examples, and correction suggestions aimed at enhancing the organisation and development of the user’s work.

Grammarly (Premium): This AI-powered, automated writing assistant delivers immediate, direct and indirect feedback on grammatical errors, writing tone, duplicate content, and synonyms to improve the overall quality of the text. The free version identifies 150 error types, addressing grammar, spelling, and punctuation, while the premium version covers over 400 error types, along with features like plagiarism detection and vocabulary suggestions. Users can select a specific writing style for tailored feedback. The program is also accessible on smartphones.

iWrite: A commercial online writing assistant that provides instant holistic scoring and feedback on language (fluency, accuracy, and complexity), content (relevance and coherence), discourse structure (organisation and discourse markers), and mechanics (spelling and punctuation). Final scores are based on either automated assessment or the instructor’s revised evaluations. The programme allows for multiple submissions and time constraints while preventing plagiarism by restricting copy-and-paste functions. Additionally, it offers teachers a variety of writing tasks to engage their students.

MY Access: A commercial web-based automated writing assistant that supports learners in developing their writing skills. It employs the IntelliMetric automated scoring system and provides pedagogical activities for teachers to help students improve content ideas, organisation, and language use. Students can view essays at various ability levels and receive feedback on writing content, style, and organisation. The tool also features support options such as a word bank, feedback, and scoring.

Online Writing System: Developed at the National University of Tainan in Taiwan, this institutional non-commercial software tool enables teachers to provide CF on student errors, focusing primarily on grammatical errors through metalinguistic explanations. Errors are underlined or highlighted to help students recognise and correct their writing mistakes. It also includes general comments on the organisation and structure of the text. Teachers can hide the feedback once students have corrected errors and reactivate it if the same mistakes occur again, promoting autonomous learning. The system provides visual examples for future reference, allowing students to view the feedback and correct errors independently.
PaperRater: A free online assistant with plagiarism detection, grammar and spell-checking, readability analysis, and automated scoring (grades 1–100 or A–F). It offers vocabulary and style feedback as well as originality percentages.

Pigai: A commercial web-based automated writing assistant tailored for Chinese EFL learners. It generates holistic scores, provides general comments on vocabulary, sentence structure, and organisation, and gives detailed feedback on grammar, wording, collocations, and mechanics for individual sentences. The tool uses colour-coded highlights for relevant words, clauses, and sentences to enhance lexical complexity.

Cool Sentence Corrective Network: A free feedback writing assistant for identifying vocabulary, collocation, and general grammar errors and providing instant correction suggestions. It also offers sample sentences and hints to assist students in self-correction.

Writing Planet™: This commercial cloud-based writing assistant is validated against standardised exams such as TOEFL, SAT, and IELTS. It uses automated assessment technology to deliver instant feedback on various aspects, including organisation, conventions, sentence structure, idea development, word choice, and style, along with specific comments and improvement suggestions, as well as links to video lessons.

Write and Improve (University of Cambridge): This free automated writing assistant utilises the CEFR scale and is trained on the Cambridge Learner Corpus. The software is validated against TOEFL, SAT, and IELTS and is suitable for self-study and classroom use. The system assesses the writer’s proficiency level and provides immediate feedback at the sentence and word levels using a colour-coded system. Teachers can create workbooks, track student progress, and download class data.

Tool – Feedback focus
Aim Writing – Vocabulary, sentence patterns, discourse structures
Awrite – Grammar, style, mechanics, lexical complexity, topic-specific vocabulary, organisation
AccurIT – Collocations, idioms, word choice, grammar, punctuation, spelling, and composition
Automated Tracking System – Feedback-tracking and revision analysis
Criterion® – Grammar, style, organisation, language use, mechanics, and content
Grammarly (Premium) – Grammar, spelling, writing tone, punctuation, vocabulary, plagiarism
iWrite – Language, content, discourse structure, and mechanics
MY Access – Adaptive feedback on writing proficiency
Online Writing System – Grammar, organisation, and structure
PaperRater – Grammar, plagiarism, readability, spell-checking, vocabulary, style, and originality
Pigai – Vocabulary, sentence structure, organisation, grammar, wording, collocations, and mechanics
Cool Sentence Corrective Network – Vocabulary, collocation, and grammar errors
Writing Planet™ – Organisation, conventions, sentence structure, idea development, word choice, and style
Write and Improve – CEFR-based feedback at the sentence/word level

Table 2: Feedback focus in individual tools
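To make the kind of output such assistants return more concrete, the sketch below uses the open-source language_tool_python package, a wrapper around the LanguageTool grammar checker, to produce direct suggestions and metalinguistic comments for a learner sentence. This is an illustrative assumption: LanguageTool is not one of the 14 reviewed tools, and the example does not reproduce any reviewed system’s feedback.

```python
# Illustrative only: language_tool_python wraps the open-source LanguageTool
# engine (it downloads a local server and requires Java); it is not one of the
# reviewed assistants, but its output mirrors typical AWCF categories.
import language_tool_python

tool = language_tool_python.LanguageTool("en-US")
learner_text = "He go to school yesterday and buyed two book."

for match in tool.check(learner_text):
    # rule identifier and message, roughly a metalinguistic comment
    print(f"{match.ruleId}: {match.message}")
    # replacement candidates, roughly direct correction suggestions
    print("  suggestions:", match.replacements[:3])

tool.close()
```

In a hybrid arrangement of the kind favoured in the reviewed studies, such raw output would be mediated by the teacher before students act on it.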
Effectiveness of AWCF compared to traditional teacher feedback and hybrid modes

The analysis of the studies indicates that AWCF is effective in improving students’ writing skills for several reasons, which will be elaborated upon in the following paragraphs. Individual papers emphasising each discussed aspect will be cited in parentheses.

AWCF proves particularly effective in enhancing grammar and vocabulary skills (Chen and Pan 2022; Wang 2020), especially in areas such as collocations and synonyms (Grami 2020). While all students benefit from AWCF, those with lower proficiency exhibit the most substantial progress (Chen and Pan 2022), particularly in terms of accuracy; however, their syntactic complexity does not improve to the same extent (Xu and Zhang 2021). This suggests that more proficient students are better equipped to utilise the feedback provided (Guo et al. 2021; Ko 2022; Xu and Zhang 2021; Yousofi 2022).

Moreover, AWCF fosters students’ overall learning ability (Wang 2020) and promotes independent learning (Ko 2022; Wang 2020). However, it has been observed that students engage with the feedback at moderate to low levels, often prioritising form over meaning during their revisions (Thi et al. 2022). While there may be occasional inconsistencies (i.e. when a tool flags similar errors differently or gives contradictory feedback), undetected grammatical or lexical errors, or false positives (i.e. instances where correct language is incorrectly flagged as an error) (Dodigovic and Tovmasyan 2021; Guo et al. 2021), AWCF generally exhibits greater accuracy than traditional human feedback (Yousofi 2022). Nonetheless, some digital writing assistants, such as iWrite, are still underdeveloped and may provide inadequate assessments of L2 writing, with feedback that can sometimes lack proper structure (Qian et al. 2020).

Further investigation into the effectiveness of AWCF has explored three modes: AWCF alone, AWCF in combination with teacher guidance (the hybrid mode), and traditional teacher feedback. Results show that students participating in the hybrid mode outperformed those in the other two groups (Han and Sari 2022; Mohsen and Abdulaziz 2019; Waer 2021). Consequently, research advocates for integrating AWCF writing assistants as valuable alternatives and complementary resources within traditional teacher-led classroom settings (Dodigovic and Tovmasyan 2021; Sherafati and Largani 2022; Sherafati et al. 2020). Teacher intervention remains especially crucial for less proficient learners, who require additional support to comprehend the feedback and implement effective self-corrections (Ko 2022; Mohsen and Abdulaziz 2019).

Users’ attitudes towards AWCF

The findings indicate that both students and teachers generally hold positive attitudes towards the use of AWCF, largely due to their favourable experiences with digital writing assistants (Hassanzadeh and Fotoohnejad 2021; Sherafati et al. 2020). They particularly appreciate the efficient, clear, timely, detailed, individualised, and comprehensible feedback, which is grounded in reliable evaluation standards (Grami 2020; Guo et al. 2021; Shang 2019; Wang 2020; Yousofi 2022). This fosters a supportive environment for students, allowing them to revisit their feedback multiple times (Xu and Zhang 2021) and improving their capacity for independent learning (Wang 2020). Additionally, teachers value AWCF writing assistants for the time they save, enabling them to optimise their resources (Sherafati et al. 2020).
Although AWCF is generally viewed positively (Hassanzadeh and Fotoohnejad 2021; Oflaz et al. 2022), it is crucial to inform students about the feedback source, as their expectations may differ between human and computer-generated feedback, ultimately influencing their writing development (Reynolds et al. 2021). Overall, the hybrid model has emerged as the preferred option due to its efficiency (Chen and Pan 2022; Reynolds et al. 2021), with both students and teachers recognising the complementary role of software in the educational process (Sherafati et al. 2020). By contrast, traditional feedback is valued for its personalised and dialogical nature, which has a lasting impact (Chen and Pan 2022; Mohsen and Abdulaziz 2019; Sherafati et al. 2020).

Discussion and conclusions

Key findings

This systematic literature review investigates the role and effect of AWCF on EFL writing skills by comparing various digital writing assistants with traditional and hybrid feedback methods. The findings reveal a diverse array of digital writing assistants available to teachers and students, offering a wide range of features and functionalities that cater to different needs and tasks. In line with previous research, such as Stevenson and Phakiti (2014), who evaluated several commercially available AWCF tools for their accuracy and constructive feedback, this review confirms the potential of such tools to enhance writing instruction. However, Stevenson and Phakiti (2014) also cautioned against risks such as encouraging surface-level learning rather than deeper L2 acquisition. The present review is distinct in its up-to-date and comprehensive synthesis of current digital writing assistants, providing a more recent overview of the tools available in today’s educational context.

The analysis further shows that AWCF encompasses a broad spectrum of elements, including grammar, vocabulary, punctuation, spelling, style, mechanics, sentence patterns, and discourse structures. This provides learners with multiple opportunities to refine their writing skills. While this aligns with the findings of all reviewed studies (cf. Cheng et al. 2022; Chen and Pan 2022; Hassanzadeh and Fotoohnejad 2021; Shang 2019), it is important to emphasise the correlation between language proficiency and the uptake of AWCF. Several studies have highlighted that although AWCF may help reduce gaps in writing accuracy among learners of various proficiency levels, it does not significantly influence the complexity or fluency of their writing. This suggests that higher proficiency is required for improvements in these areas (Ko 2022; Xu and Zhang 2021). Additionally, Li et al. (2015) observed that higher-proficiency learners not only demonstrated greater advancements but also exhibited stronger motivation to improve.

Finally, this review indicates that both students and teachers generally hold positive attitudes towards the use of AWCF, with more advanced learners showing stronger preferences (Ko 2022). Regardless of individual differences, there is a clear tendency towards hybrid modes of feedback, underlining the indispensable role of human interaction in developing writing skills (Sherafati and Largani 2022; Waer 2021). Several researchers (Adzhar and Sazalli 2024; Alharbi 2022; Tang et al. 2021) have even argued that AWCF without teacher guidance is often ineffective because educators play a pivotal role in mitigating hypercorrection risks and clarifying ambiguities through careful feedback.
Overall, social interaction emerges as a decisive factor in fostering independent revision strategies (Cavaleri et al. 2019; Hyland and Hyland 2019).

Although this review analyses a wide range of digital tools, it deliberately refrains from endorsing any single one for several reasons. First, selecting one tool over another introduces bias and may unintentionally serve as promotion. Second, determining the “best” tool is inherently subjective: what works well for one learner may not suit another, given differing needs and preferences. Third, the rapid pace of technological development means that tools evolve quickly, and any recommendation may soon become outdated. The primary goal, therefore, is not to rank tools but to provide a broad overview of available options that may inform or caution potential users. Most importantly, this review serves as a reminder, albeit a familiar one, of the irreplaceable role of the teacher. In the face of growing digitalisation and the integration of artificial intelligence, the teacher’s role remains more vital than ever.

Pedagogical implications and limitations

This review offers several pedagogical implications. First, it highlights the importance of teachers having a solid understanding of AWCF tools and their functionalities, which requires a high level of digital literacy among teachers. Second, it underscores the need for differentiated instructional strategies: while lower-proficiency learners may benefit more from feedback on form and grammar, higher-level learners should be guided towards improving structure and idea development. To scaffold their use, teachers might begin by demonstrating the tool in a guided setting, showing how to interpret feedback and apply it to simple writing tasks, before gradually moving students to independent use on more complex assignments. Teachers can model the evaluation of AWCF suggestions by discussing why some automated corrections may not fit the writer’s intention and encouraging students to justify, modify, or reject feedback. Overall, digital writing assistants should be incorporated into classroom settings as supplementary aids that promote independent, long-term utilisation. Since educators can introduce a variety of tools, they should also encourage learners to critically evaluate their usefulness and determine which tool best suits their tasks, rather than passively accepting automated suggestions. This approach fosters metalinguistic awareness and empowers students to make informed choices. Most importantly, despite technological advances, the teacher’s role remains essential in guiding writing development, resolving ambiguities, contextualising feedback, and ensuring that the process supports learning goals. Finally, rather than using AWCF sporadically or for novelty, it should be systematically embedded into writing instruction, aligned with learning outcomes, and supported with clear guidelines on interpreting and applying feedback.

This review has certain limitations. The decision to select only one electronic database may have resulted in the omission of potentially relevant publications due to selection bias. Another limitation concerns the specificity of the search terms, which may have excluded otherwise eligible studies.
It is also important to acknowledge that review papers are inherently susceptible to publication bias (Garrecht et al. 2018). Finally, this review included only studies published in English, likely excluding valuable insights from research conducted in other languages on AWCF writing assistants, thereby affecting the comprehensiveness and inclusivity of the findings.

Future research could address these limitations by employing multiple databases, using broader search terms, and incorporating non-English studies to capture a wider range of perspectives on AWCF writing assistants. Additionally, longitudinal studies could examine the long-term effects of integrating digital writing tools in diverse classroom settings.

Despite these limitations, this review makes a meaningful contribution by synthesising current knowledge on the pedagogical implications of AWCF tools, highlighting the teacher’s critical role. Moving forward, research and practice should explore the integration of AI-powered writing assistants into curricula, alongside targeted teacher training in digital literacy, to maximise the benefits for learners with different proficiency levels.

References

Adzhar, N. B. and Sazalli, N. A. H. (2024). Written corrective feedback in the ESL classroom: A systematic analysis of teachers’ beliefs, students’ perceptions, and preferences. International Journal of Academic Research in Progressive Education and Development, 13, issue 1, pp. 1263–1289.
Alharbi, M. A. (2022). Exploring the impact of teacher feedback modes and features on students’ text revisions in writing. Assessing Writing, 52, 100610.
Bitchener, J. (2008). Evidence in support of written corrective feedback. Journal of Second Language Writing, 17, issue 2, pp. 102–118.
Bitchener, J. (2012a). A reflection on the language learning potential of written CF. Journal of Second Language Writing, 21, issue 4, pp. 348–363.
Bitchener, J. (2012b). Written corrective feedback for L2 development: Current knowledge and future research. TESOL Quarterly, 46, issue 4, pp. 855–860.
Bitchener, J. and Ferris, D. R. (2012). Written Corrective Feedback in Second Language Acquisition and Writing (1st ed.). New York: Routledge.
Bitchener, J. and Knoch, U. (2008). The value of written corrective feedback for migrant and international students. Language Teaching Research, 12, issue 3, pp. 409–431.
Bitchener, J. and Knoch, U. (2009a). The relative effectiveness of different types of direct written corrective feedback. System, 37, issue 2, pp. 322–329.
Bitchener, J. and Knoch, U. (2009b). The value of a focused approach to written corrective feedback. ELT Journal, 63, issue 3, pp. 204–211.
Bitchener, J. and Knoch, U. (2010). Raising the linguistic accuracy level of advanced L2 writers with written corrective feedback. Journal of Second Language Writing, 19, issue 4, pp. 207–217.
Black, P. and Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice, 5, issue 1, pp. 7–74.
Black, P. and Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21, issue 1, pp. 5–31.
Cavaleri, M., Kawaguchi, S., Di Biase, B. and Power, C. (2019). How recorded audio-visual feedback can improve academic language support. Journal of University Teaching & Learning Practice, 16, issue 4, pp. 3–23.
Chen, H. and Pan, J. (2022). Computer or human: a comparative study of automated evaluation scoring and instructors’ feedback on Chinese college students’ English writing. Asian-Pacific Journal of Second and Foreign Language Education, 7, issue 34, pp. 1–20.
Cheng, G. (2019). Exploring the effects of automated tracking of student responses to teacher feedback in draft revision: evidence from an undergraduate EFL writing course. Interactive Learning Environments, 30, issue 2, pp. 353–375.
Cheng, G., Chwo, G. S. M. and Ng, W. S. (2022). Automated tracking of student revisions in response to teacher feedback in EFL writing: technological feasibility and teachers’ perspectives. Interactive Learning Environments, 31, issue 8, pp. 5236–5260.
Clark, I. (2012). Formative assessment: Assessment is for self-regulated learning. Educational Psychology Review, 24, issue 2, pp. 205–249.
Digitalna knjižnica Univerze v Ljubljani (DiKUL) (n.d.). Available at: https://tinyurl.com/42cvw7pd (retrieved 29 August 2025).
Ding, M. (2024). Transforming assessment in education: A critical reflection. Communications in Humanities Research, issue 47, pp. 67–72.
Dodigovic, M. and Tovmasyan, A. (2021). Automated writing evaluation: The accuracy of Grammarly’s feedback on form. International Journal of TESOL Studies, 3, issue 2, pp. 71–87.
Ellis, R. (2009). A typology of written corrective feedback types. ELT Journal, 63, issue 2, pp. 97–107.
Evans, N. W., Hartshorn, K. J., McCollum, R. M. and Wolfersberger, M. (2010). Contextualizing corrective feedback in second language writing pedagogy. Language Teaching Research, 14, issue 4, pp. 445–463.
Ferbežar, N. and Štemberger, T. (2023). Sistematični pregled literature v kontekstu pedagoškega raziskovanja. Sodobna pedagogika, 74, issue 1, pp. 28–43.
Ferris, D. (1999). The case for grammar correction in L2 writing classes: A response to Truscott (1996). Journal of Second Language Writing, 8, issue 1, pp. 1–11.
Ferris, D. (2003). Response to Student Writing: Implications for Second Language Students. New York: Routledge.
Ferris, D. R. (2004). The “grammar correction” debate in L2 writing: Where are we, and where do we go from here? (and what do we do in the meantime…?). Journal of Second Language Writing, 13, issue 1, pp. 49–62.
Garrecht, C., Bruckermann, T. and Harms, U. (2018). Students’ decision-making in education for sustainability-related extracurricular activities: A systematic review of empirical studies. Sustainability, 10, issue 11, pp. 1–19.
Grami, G. M. A. (2020). An evaluation of online and automated English writing assistants: collocations and idioms checkers. International Journal of Emerging Technologies in Learning (iJET), 15, issue 4, pp. 218–226.
Guo, Q., Feng, R. and Hua, Y. (2021). How effectively can EFL students use automated written corrective feedback (AWCF) in research writing? Computer Assisted Language Learning, 35, issue 9, pp. 2312–2331.
Han, T. and Sari, E. (2022). An investigation on the use of automated feedback in Turkish EFL students’ writing classes. Computer Assisted Language Learning, 37, issue 4, pp. 961–985.
Hassanzadeh, M. and Fotoohnejad, S. (2021). Implementing an automated feedback program for a foreign language writing course: A learner-centric study: Implementing an AWE tool in an L2 class. Journal of Computer Assisted Learning, 37, issue 5, pp. 1494–1507.
Hattie, J. and Timperley, H. (2007). The power of feedback. Review of Educational Research, 77, issue 1, pp. 81–112.
Hendrickson, J. (1978). Error correction in foreign language teaching: Recent theory, research, and practice. Modern Language Journal, issue 62, pp. 387–398.
Hyland, K. and Hyland, F. (eds.) (2019). Feedback in Second Language Writing: Contexts and Issues. Cambridge: Cambridge University Press.
Kirmayr, M., Quilodrán, C., Valente, B., Loézar, C., Garegnani, L. and Franco, J. V. A. (2021). The GRADE approach, Part 1: How to assess the certainty of the evidence. Medwave, 21, issue 2.
Ko, C. J. (2022). Online individualized corrective feedback on EFL learners’ grammatical error correction. Computer Assisted Language Learning, 37, issue 7, pp. 1449–1477.
Li, J., Link, S. and Hegelheimer, V. (2015). Rethinking the role of automated writing evaluation (AWE) feedback in ESL writing instruction. Journal of Second Language Writing, issue 27, pp. 1–18.
Li, S. (2018). Corrective feedback in L2 speech production. In: J. Liontas (ed.). The TESOL Encyclopedia of English Language Teaching. Oxford: Blackwell, pp. 1–9.
Lyster, R. and Ranta, L. (1997). Corrective feedback and learner uptake: Negotiation of form in communicative classrooms. Studies in Second Language Acquisition, 19, issue 1, pp. 37–66.
Mohsen, M. A. and Abdulaziz, A. (2019). The effectiveness of using a hybrid mode of automated writing evaluation system on EFL students’ writing. Teaching English with Technology, 19, issue 1, pp. 118–131.
Nagode, G. P., Pižorn, K. and Juriševič, M. (2014). The role of written corrective feedback in developing writing in L2. ELOPE: English Language Overseas Perspectives and Enquiries, 11, issue 2, pp. 89–98.
Nguyen, M. T. T. and Renandya, W. (2023). Written corrective feedback. In: H. Mohebbi and Y. Wang (eds.). Insights into Teaching and Learning Writing: A Practical Guide for Early-Career Teachers. Melbourne: Castledown Publishers, pp. 127–138.
Nitko, A. J. and Brookhart, S. M. (2018). Educational Assessment of Students. Harlow: Pearson.
Oflaz, M., Diker Coskun, Y. and Bolat, Ö. (2022). The effects of the technology-integrated writing lessons: CIPP Model of Evaluation. Turkish Online Journal of Educational Technology-TOJET, 21, issue 1, pp. 157–179.
O’Neill, R. and Russell, A. (2019). Stop! Grammar time: University students’ perceptions of the automated feedback program Grammarly. Australasian Journal of Educational Technology, 35, issue 1, pp. 42–56.
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., McGuinness, L. A., Stewart, L. A., Thomas, J., Tricco, A. C., Welch, V. A., Whiting, P. and Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, 372, issue 71, pp. 1–9.
Pižorn, K. (2014). Učinki učiteljeve pisne povratne korektivne informacije pri razvijanju pisne zmožnosti v tujem jeziku. Ljubljana: Zavod RS za šolstvo.
Prasad, M. (2024). Introduction to the GRADE tool for rating certainty in evidence and recommendations. Clinical Epidemiology and Global Health, 25, 101484.
Qian, L., Zhao, Y. and Cheng, Y. (2020). Evaluating China’s automated essay scoring system iWrite. Journal of Educational Computing Research, 58, issue 4, pp. 771–790.
Readman, K. and Allen, B. (2013). Practical planning and assessment. Melbourne: OUP Australia.
Reddy, Y. M. and Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35, issue 4, pp. 435–448.
Reynolds, B. L., Kao, C. W. and Huang, Y. Y. (2021). Investigating the effects of perceived feedback source on second language writing performance: A quasi-experimental study. Asia-Pacific Education Researcher, 30, pp. 585–595.
Shang, H. F. (2019). Exploring online peer feedback and automated corrective feedback on EFL writing performance. Interactive Learning Environments, 30, issue 1, pp. 4–16.
Sherafati, N. and Largani, F. M. (2022). The potentiality of computer-based feedback in fostering EFL learners’ writing performance, self-regulation ability, and self-efficacy beliefs. Journal of Computers in Education, 10, pp. 27–55.
Sherafati, N., Largani, F. M. and Amini, S. (2020). Exploring the effect of computer-mediated teacher feedback on the writing achievement of Iranian EFL learners: Does motivation count? Education and Information Technologies, 25, pp. 4591–4613.
Shute, V. J. and Rahimi, S. (2017). Review of computer-based assessment for learning in elementary and secondary education. Journal of Computer Assisted Learning, 33, issue 1, pp. 1–19.
Stevenson, M. and Phakiti, A. (2014). The effects of computer-generated feedback on the quality of writing. Assessing Writing, 19, issue 1, pp. 51–65.
Stiggins, R. J. (2002). Assessment crisis: The absence of assessment for learning. Phi Delta Kappan, 83, issue 10, pp. 758–765.
Tang, J., Qian, K., Wang, N. and Hu, X. (2021). Exploring language learning and corrective feedback in an eTandem project. Journal of China Computer-Assisted Language Learning, 1, issue 1, pp. 110–144.
Thi, N. K., Nikolov, M. and Simon, K. (2022). Higher-proficiency students’ engagement with and uptake of teacher and Grammarly feedback in an EFL writing course. Innovation in Language Learning and Teaching, 17, issue 3, pp. 690–705.
Truscott, J. (1996). The case against grammar correction in L2 writing classes. Language Learning, 46, issue 2, pp. 327–369.
Truscott, J. (1999). The case for “The case against grammar correction in L2 writing classes”: A response to Ferris. Journal of Second Language Writing, 8, issue 2, pp. 111–122.
Truscott, J. (2004). Evidence and conjecture on the effects of correction: A response to Chandler. Journal of Second Language Writing, 13, issue 4, pp. 337–343.
Truscott, J. (2007). The effect of error correction on learners’ ability to write accurately. Journal of Second Language Writing, 16, issue 4, pp. 255–272.
Truscott, J. (2010). Some thoughts on Anthony Bruton’s critique of the correction debate. System, 38, issue 2, pp. 329–335.
Truscott, J. and Hsu, A. Y. P. (2008). Error correction, revision, and learning. Journal of Second Language Writing, 17, issue 4, pp. 292–305.
Turk, N. (2021). Metodologija priprave sistematičnih preglednih člankov. Zdravniški vestnik, 90, issue 7/8, pp. 432–442.
Waer, H. (2021). The effect of integrating automated writing evaluation on EFL writing apprehension and grammatical knowledge. Innovation in Language Learning and Teaching, 17, issue 1, pp. 47–71.
Wang, W. (2023). The efficacy of written corrective feedback: Searching for the best predictors. International Journal of Applied Linguistics, 34, issue 2, pp. 484–500.
Wang, Z. (2020). Computer-assisted EFL writing and evaluations based on artificial intelligence: a case from a college reading and writing course. Library Hi Tech, 40, issue 1, pp. 80–97.
Wilson, J., Olinghouse, N. G. and Andrada, G. N. (2014). Does automated feedback improve writing quality? Learning Disabilities: A Contemporary Journal, 12, issue 1, pp. 93–118.
Winna, W. and Sabarun, S. (2023). The language assessment in teaching-learning English. DIAJAR Jurnal Pendidikan Dan Pembelajaran, 2, issue 4, pp. 413–419.
Xu, J. and Zhang, S. (2021). Understanding AWE feedback and English writing of learners with different proficiency levels in an EFL classroom: A sociocultural perspective. Asia-Pacific Education Researcher, 31, pp. 357–367.
Yousofi, R. (2022). Grammarly deployment (in)efficacy within EFL academic writing classrooms: an attitudinal report from Afghanistan. Cogent Education, 9, issue 1, pp. 1–27.

Karmen PIŽORN (Univerza v Ljubljani, Pedagoška fakulteta, Slovenija)
Melita LEMUT BAJEC (Univerza v Ljubljani, Pedagoška fakulteta, Slovenija)

SYSTEMATIC REVIEW OF DIGITAL WRITING ASSISTANTS IN THE DEVELOPMENT OF WRITING SKILLS IN ENGLISH AS A FOREIGN LANGUAGE

Summary: Written corrective feedback (WCF) refers to the teacher’s correction of learners’ errors. When properly embedded in the educational process, it improves learners’ writing ability and supports a holistic understanding of the writing process. Automated written corrective feedback (AWCF), delivered through digital writing assistants, eases teachers’ workload by reducing the time and effort required to provide feedback. It also enables learners to review and correct their work independently. The purpose of this systematic literature review, conducted in accordance with the predefined PRISMA 2020 protocol, is to provide an overview of research on digital writing assistants and AWCF. The review includes 22 peer-reviewed English-language articles, published between 2018 and 2023, that examined the use of digital writing assistants for developing writing skills in English instruction in comparison with traditional teacher-generated feedback or a hybrid mode that combines AWCF with teacher guidance. The participants in these studies were undergraduate students of English as a foreign language as well as students of other disciplines. The tasks primarily involved essay writing, but also paragraph and research-paper writing. The findings show that users most appreciate immediate and accurate feedback, which addresses grammar, sentence structure, idea development, word choice, style, cohesion, and coherence. The occurrence of false positives, where a tool marks a correct structure as an error, must also be taken into account. Users are alerted to errors and possible improvements in various ways: through markers, highlighting, metalinguistic comments, direct suggestions, interactive lessons, and so on. The studies showed that users of digital writing assistants effectively improved their writing skills. Since the results support the hybrid mode, digital writing assistants should be systematically integrated into classroom practice as supplementary aids, as they foster users’ metalinguistic awareness and encourage independent learning. However, the use of AWCF tools requires teachers who understand how these tools work.
We consider the most important insight of this systematic review to be precisely the teacher’s role as a key mediator in clarifying feedback, resolving misunderstandings, and aligning the use of these tools with learning objectives.

Keywords: digital writing assistants, automated written corrective feedback, systematic literature review, English as a second/foreign language, writing skills

E-mail: karmen.pizorn@pef.uni-lj.si

Appendix A: Resources in the Digital Library of the University of Ljubljana (15 April 2023)

Academic Search Complete (EBSCO), AccessMedicine, ACM Digital Library, Advanced Technologies & Aerospace Database (ProQuest), AgEcon: Research in Agricultural and Applied Economics, AGRICOLA Articles (NAL), AGRICOLA Books (NAL), AGRIS (FAO), Allgemeines Künstlerlexikon Online / Artists of the World Online (AKLO), American Chemical Society All Publications Package, American Medical Association (AMA journals), American Physical Society (APS) Journals, Analytical Abstracts, ArXiv.org, ASCE Journals, Avery Index to Architectural Periodicals, L’Année philologique, Atla Religion Database with AtlaSerials PLUS, BAUFO (Bauforschungsprojekte), Beck online, BioMed Central Journals, Bizi.si, Business Source Complete (EBSCO), Business Source Premier (EBSCO), CAB Abstracts, Chemical Hazards in Industry, CINAHL Ultimate (EBSCO), Civil Rights and Social Justice (HeinOnline), Clinical Key Student, Cochrane Library (Wiley), Communication & Mass Media Complete (EBSCO), Credo Online Reference Service, Academic Core Collection, De Gruyter Online, Dela FDV, Delovni zvezki EF, Dentistry & Oral Sciences Source, Digitalna knjižnica BF, Directory of Open Access Journals, DXplain, East Europe & Central Europe Database (ProQuest), eBook Collection (EBSCO), EconLit with Full Text (EBSCO), Embase Ovid, Emerald Insight, Encyclopaedia of the Neo-Latin World, Encyclopedia Britannica Online Academic Edition, Encyclopedia of Library and Information Sciences, Encyclopedia of Slavic Languages and Linguistics Online, ePrints FRI, ERIC (EBSCO), ERIC (ProQuest), FindINFO, Forest Science Database, FSTA (Food Science and Technology Abstracts), GreenFILE (EBSCO), Grove Art Online / Oxford Art Online, Grove Music Online / Oxford Music Online, Gun Regulation and Legislation in America (HeinOnline), GVIN.com, HeinOnline, Highwire Press, ICONDA Bibliographic, ICONDA CIB Library, IEEE/IET Electronic Library (IEL), IOP Science (Institute of Physics Journals), JapanKnowledge Lib, JSTOR, Keesing’s World News Archive, Kindlers Literatur Lexikon, KLGonline, Laboratory Hazards Bulletin, Library Literature & Information Science Full Text (EBSCO), Linguistics Collection (ProQuest), LISC - Library & Information Science Collection (ProQuest), LISTA Library, Information Science & Technology (EBSCO), Literature Resource Center (Gale), Loeb Classical Library, MasterFile Premier (EBSCO), Materials Science & Engineering Database (ProQuest), MathSciNet (AMS), Max Planck Encyclopedias of International Law, MECH, MEDLINE (EBSCO), MEDLINE (Ovid), Military & Government Collection (EBSCO), MLA International Bibliography (EBSCO), Musik in Geschichte und Gegenwart (MGG Online), Natural Product Updates, Nature (NPG) Journals, Naxos Music Library, Naxos Music Library Jazz, Naxos Music Library World, Naxos Video Library, Newswires, The New Pauly Online, NAUTOS, OECD iLibrary,
Oxford Academic Journals, Oxford English Dictionary Online, Oxford Reports on International Law, Oxford Scholarly Editions Online, Passport, PeFprints, PERINORM International, Political Science Complete (EBSCO), Preprinti IMFM, PressReader, ProQuest Business eBook Collection, ProQuest Dissertations & Theses Global (PQDT Global), ProQuest Ebook Central Social Science, ProQuest One Business, PsycArticles, PsycINFO, PTSDpubs, PubMed, RDB Rechtsdatenbank, Regional Business News (EBSCO), RILM Abstracts of Music Literature (EBSCO), RILM Music Encyclopedias (EBSCO), RSC Gold, SAGE Journals Online, Science Online (Science Magazine), ScienceDirect (Elsevier), SciFinder, Scopus, SHADIS, Slavery in America and the World: History, Culture & Law (HeinOnline), Slovarji AMEBIS, Social Science Database (ProQuest), Social Services Abstracts (ProQuest), SocINDEX with Full Text (EBSCO), Sociological Abstracts (ProQuest), Sources Chrétiennes Online, SPORTDiscus with Full Text (EBSCO), Springer Protocols, SpringerLink, Statista, Synthetic Reaction Updates, Tax-Fin-Lex, Taylor & Francis Online, Technology Collection (ProQuest), TEMA® Technology and Management, Translations Studies Bibliography, Trip Database, Ulrichsweb™, United Nations iLibrary, UNWTO eLibrary, WARC, Web of Science, Westlaw International, Wiley Online Library, WorldCat - OCLC, Writing Guides, Your Journals@Ovid Full Text (Ovid), zbMATH Open