As instructors in the college of education within a large public university in the southeastern United States, we are keenly aware of both the benefits and challenges that arise at the intersection of artificial intelligence (AI) and education. Furthermore, as action researchers, we are inclined to explore the outcomes of AI integration by applying these tools within our practice.
This paper reports an action research study conducted across four teacher education courses (graduate and undergraduate) during a summer semester within a college of education that serves a diverse population across multiple campuses and degree areas, including educational leadership, special education, and school counseling. We review ways the current literature in this field informed our thinking, outline our AI integration and data collection procedures for the study, and discuss the results from both the instructors’ and the students’ perspectives.
Literature Review
The use of AI has been highly debated among scholars in recent years, with opinions divided concerning its utility. As learning methods evolve, educators must decide whether AI enhances student proficiency or whether its potential downsides warrant classroom restriction. To better understand this issue, it is important to consider the diverse range of literature available on the topic. AI is defined as a man-made system of computer-based algorithms and programs that can perform tasks typically requiring human intelligence (Chan & Colloton, 2024). In an educational framework, AI is defined as a dynamic force that combines advanced technology with adaptive learning strategies, offering personalized academic experiences and enhancing tasks such as academic writing (Malik et al., 2023). These tools can analyze large amounts of data to provide customized feedback, enabling students to learn at their own pace and according to their unique needs.
Several AI-assisted tools are used daily, such as “learning management systems, online discussion boards, exam integrity, transcriptions of lectures, assisted chatbots,” and more (Chaudhry et al., 2023). Examples include ChatGPT, Elicit, Grammarly, and Duolingo (Farrokhnia et al., 2024; Fowler, 2023; Malik et al., 2023; Miranty & Widiati, 2021). These tools differ in their complexity and the kind of assistance they provide. While Grammarly and ChatGPT focus on enhancing writing and communication, others like Duolingo are designed for language learning. Elicit is a tool that assists with research by helping users find relevant academic resources.
The success of AI integration into education largely depends on educators’ readiness to adopt these technologies and the strategies they employ in the classroom (Leoste et al., 2021). However, as with any innovation, the use of AI in higher education presents issues surrounding its implications for both students and educators.
Challenges and Opportunities
As students increasingly use AI to handle tasks that once required independent thinking, concerns continue to rise that academic honesty is declining. Chaudhry et al. (2023) concluded that tools like ChatGPT make academic integrity even harder to track, given that students might use these tools to quickly generate an assignment, which calls into question their actual learning and development.
Alharbi (2023) argued that AI-powered tools may obscure a student’s language ability, making it challenging for educators to assess students’ writing capabilities. Other researchers, such as Rodrigues (2020) and Lund and Wang (2023), have raised ethical issues, including data privacy and intellectual property. ChatGPT has also been criticized for its potential to foster plagiarism and cheating (Gašević et al., 2023).
Despite these challenges, AI tools like ChatGPT offer notable benefits. They can act as personalized tutoring systems that provide students with teaching and feedback on complex tasks, such as academic writing (Zhai, 2022). According to Gayed et al. (2022), AI writing tools enhance students’ writing competencies and confidence by providing immediate, actionable feedback that improves their skills. Unlike traditional search engines, ChatGPT goes a step further by not only providing quick responses but also efficiently summarizing information, making it a more comprehensive tool for academic support (Cascella et al., 2023; Farrokhnia et al., 2024).
ChatGPT also provides tailored writing assistance, helping students craft better essays and improve their proficiency (Malik et al., 2023). Furthermore, Malik et al. discovered that AI tools refine writing skills and promote a better understanding of academic integrity, countering some of the concerns raised by Chaudhry et al. (2023).
Role of AI and Colleges of Education
Building on these benefits, several studies have investigated the use of Grammarly, an AI-based grammar and style checker, to enhance the writing performance of undergraduate students. Miranty and Widiati (2021) researched the usage of Grammarly and discovered that the students utilized the tool to overcome difficulties with in-class assignments and materials. They benefited from its automatic correction, checking for missing citations, and general support for their writing skills. Likewise, Cavaleri and Dianati (2016) indicated that more than 80% of the students participating in their study evaluated Grammarly as user friendly and said it facilitated autonomous learning. Students appreciated that Grammarly helped them understand and correct their mistakes without relying solely on their professors.
Further supporting this, Almusharraf and Alotaibi (2023) examined Grammarly’s effectiveness in improving English writing skills, particularly its ability to catch errors and provide corrective feedback. Interestingly, some studies even suggest that students prefer feedback from Grammarly over that of their instructors (O’Neill & Russell, 2019). However, multiple studies have emphasized the importance of not entirely depending on AI tools to catch all errors, as developing the ability to identify and correct mistakes independently remains an essential skill for students (Almusharraf & Alotaibi, 2023; Cavaleri & Dianati, 2016; Miranty & Widiati, 2021).
While undergraduate students primarily use assistant tools to catch errors in their writing, graduate-level students tend to engage with AI in more in-depth ways. Rice et al. (2024) suggested that ChatGPT can significantly aid graduate students in the research process by identifying relevant literature, evaluating sources, extracting and synthesizing information, identifying research gaps, generating hypotheses, and helping to formulate well-defined research questions. Similarly, Farrokhnia et al. (2024) and Rospigliosi (2023) stressed the usefulness of ChatGPT in assisting graduate students with drafting research proposals, a foundational step in the research process.
However, the development of a strong proposal is only the beginning. Graduate students also face the daunting task of comprehensive literature reviews, a critical component of academic research that all too often determines the scope and direction of graduate study. This process is further complicated by the sheer volume of scholarly publications, estimated at 2.5 million articles annually, which can overwhelm even the most diligent researchers (Xiang et al., 2024). To address this challenge, Xiang et al. argued that AI tools can assist graduate students in navigating the literature and managing citations. Their study found a “35% reduction in the time spent on citation tasks” when using AI tools compared to “traditional manual methods” (p. # OF QUOTATION?). This significant time saving allows students to spend more time on the intellectual aspects of their work and fosters more profound engagement with the research material.
As active contributors to academia, graduate students are uniquely positioned to leverage AI tools like ChatGPT to navigate vast amounts of information and enhance the quality of their scholarly work. Dorado (2024) found that students who used AI tools during the conceptualization phase of their research produced more structured and cohesive papers. The study also revealed that introducing AI tools in graduate courses prompted a variety of emotions among students, ranging from “excitement and curiosity” to “stress reduction” (p. # OF QUOTATION?). The feedback in the study emphasized AI’s potential to support graduate students with complicated tasks while easing some of the pressures of rigorous research demands.
Best Practices of Teaching With AI
As AI technology evolves, it is important to balance its benefits with its potential risks. Clear policies on AI usage can help maintain academic standards while encouraging students to use AI as a tool for learning rather than a shortcut (Fukuda-Parr & Gibbons, 2021; Storey & Wagner, 2024). Teaching students how to use AI ethically and effectively is necessary to promote responsible research practices. By guiding students through the process of using AI tools in their academic writing, educators can help them develop the skills needed to use AI in ways that will not compromise their integrity or hinder their learning (Xiang et al., 2024). Storey and Wagner (2024) expanded on this by addressing the need for AI literacy and academic honesty in higher education. They called for institutions to work together with students and administrators to create ethical frameworks that address plagiarism, cheating on exams, fabricated citations, and copyright violations.
Policies cannot entirely remove every potential for unethical use, but they may serve as a guide toward what can be considered acceptable support that inspires critical thinking and creativity. After all, students themselves need to make honorable decisions regarding their use of AI resources in ways that improve, not hinder, their own learning. Recognizing the challenges associated with AI, our study built on the existing literature to examine the practical and ethical implications of AI integration in higher education. To bridge the gap between theory and practice, we employed an action research approach that leveraged our experience as educators to explore how incorporating and modeling AI tools in educational coursework impacted student perceptions and to identify best practices for their usage.
Methodology
As instructors within the college of education, we share a background in elementary classroom teaching. Our experience as classroom teachers has helped us see the immense value of studying one’s own practice and making data-based decisions designed to create optimal learning experiences for students in our courses. We define action research as an extension of mastery teaching and a way to address the divide between theory and practice. Cochran-Smith and Lytle (2009) explained that action researchers continuously engage in a “process of problem posing, data gathering, analysis, and action” (p. 40).
A key hallmark of any action research study is the identification of the problem, often called the “problem of practice.” Generally defined, a problem of practice is “a complex and sizeable, yet still actionable, problem which exists within a professional’s sphere of work” (Henriksen et al., 2017, p. 142). As already discussed, the role of AI in teacher education qualifies as a complex and sizeable problem, and conducting action research assists us in not only gaining an understanding of the scope of the problem but also taking steps toward navigating our way through it.
Action research is a cyclical process, with the findings from one cycle often leading to the beginning of the next one. This study is our first documented cycle of collaborative action research within this problem of practice. As individual instructors, we had conducted informal cycles within our courses as we experimented with AI integration, but as the 2024 summer semester began, we initiated a formal action research project with the university, guided by the following questions:
- How does integrating and modeling AI tools in educational coursework impact undergraduate and graduate students’ perceptions of AI?
- What do instructor and student reflections reveal about the best practices of AI integration into education coursework at the undergraduate and graduate levels?
To address the research questions, we collected data throughout the courses using multiple methods. Student feedback was gathered through surveys administered after each course, targeting undergraduate and graduate students. The survey (see appendix) consisted of 10 questions; it was administered virtually during the semester by both instructors and asked students to reflect on how their perceptions and knowledge of AI were impacted throughout the course. Participation in the survey was anonymous and optional to ensure that it would not affect students’ grades and to minimize instructor bias toward individual responses. Since all students were preservice or in-service teachers, the survey also asked about their perceptions of AI’s role in K-12 and higher education classrooms.
Additionally, graduate students participated in AI reflection discussion boards three times during the semester, allowing for ongoing reflection as they learned more about AI tools. As action researchers, we also contributed to the data collection through self-reflections recorded at the beginning (preplanning meeting), middle (check-in meeting), and end of the semester (debrief meeting). Finally, course evaluations were completed once, at the end of the course, by both undergraduate and graduate students. This comprehensive approach to data collection provided a well-rounded view of the AI integration experience, integrating perspectives from students and the instructors at various points during the semester.
The survey data served as a foundational component for analysis, offering an initial understanding of student perceptions and experiences. The data were analyzed using an open-coding process at the question level, enabling the identification of shared and recurring experiences reported by participants. Following this analysis, broader categories were developed by grouping responses to questions that reflected similar themes (a process recommended by Merriam & Grenier, 2019). These categories were further examined to identify overarching themes, providing insight into the relationship between individual student experiences, instructional practices, and actionable recommendations for instructors. Last, student evaluation data from the university course evaluation system and our instructor reflections were analyzed for additional evidence of instructional practice changes from the action research cycle.
Undergraduate AI Integration
At the undergraduate level, the focus of AI integration was to expose students to and model the different types of AI tools available and to demonstrate how these tools could be used for undergraduate coursework and, more importantly, as tools for success as future classroom teachers. The two undergraduate courses included in the study were a reading methods course targeting instruction in Grades 3-8 and a reading diagnosis and assessment course. AI integration points were predetermined in both courses, including recorded lectures, discussion boards, and assignment components.
Recorded Lectures
These courses were administered in completely virtual formats, making the recorded lectures a valuable point of reference for students. The reading methods course offered greater opportunities for integrating and applying AI. In contrast, the reading diagnosis and assessment course provided initial exposure to AI but faced constraints due to program alignments and state compliance requirements, which limited students’ potential AI usage. Within each course, a lecture was provided to outline which AI tool was used (e.g., ChatGPT or CoPilot), how it was used, and how the students might potentially implement it. These multifaceted lectures became a space where AI was discussed within and beyond the course content. Additionally, the lectures modeled ways AI tools could enhance students’ current assignments and helped brainstorm future practical uses in classrooms.
Discussion Boards
Both courses had a discussion board assignment that allowed students a space to discuss AI tools, concerns, benefits, and best practices they encountered either during this course or through personal use of AI. Students used this space to engage in dialog about possible uses, different types of AI, and concerns.
Assignment Components
Both courses used AI to generate ideas for the lesson plans that students created in the course. Students shared their initial thoughts, how they used those thoughts in conjunction with AI, and which AI tool they used. They also reflected on the AI-generated product. Some students included the AI component in their final lesson plan along with a citation for the AI tool used, while others omitted the AI options in favor of their own lesson plan elements.
Graduate AI Integration
The primary focus for integration at the graduate level was modeling appropriate AI use and explicit instruction on where AI can and should be used to enhance course assignments. This study included two graduate courses: a curriculum design course and a curricular trends and issues course. Three integration points were chosen for AI tools in these courses: videoconferencing sessions, discussion boards, and beginning assignments.
Videoconferencing Sessions
While both courses were taught fully online, synchronous videoconferencing sessions on the Zoom platform were held and recorded twice in each course. The first session occurred within the first 3 weeks of the semester, and the second occurred between Weeks 6 and 7. Each session lasted an hour, with the intention of orienting students to upcoming assignments and addressing questions as a whole group. During these sessions, the instructor modeled using AI tools (e.g., ChatGPT, elicit.com, and CoPilot) to enhance or begin specific assignments. Students were encouraged to follow along with the instructor and try the tools as they were modeled.
Discussion Boards
Each course included a discussion board assignment that specifically asked students to use AI to complete an activity (e.g., generate ideas for aspects of a curriculum unit, brainstorm research questions on a specific topic, suggest assessment ideas for a unit), and then reflect on their experience using AI.
Beginning Assignments
Both graduate courses followed a project-based model, where students worked on smaller portions of a larger project (e.g., a curriculum unit or research paper) throughout the course. During the videoconferencing sessions, the instructor highlighted how a tool like elicit.com could assist in finding research articles or how ChatGPT could be used for idea generation. Students were encouraged to use AI in these ways and, if they chose to do so, were asked to disclose how they used AI within the comment box on their assignment submission so that their use was clear to the instructor.
Findings
The Impact of AI Integration
To answer our first research question about the impact of exploring and modeling AI tools on students’ perceptions, we used information from our survey, specifically focusing on Questions 3, 4, and 9. Question 3 asked students how AI has positively impacted their learning experience, Question 4 asked how modeling had impacted their use of AI, and Question 9 centered around any changes to their perceptions of AI throughout the semester.
We found that 90% (n = 22) of undergraduates and graduates had used AI before these courses. However, these courses provided new and explicit ways to use AI that they found beneficial and applicable to their coursework. These ways included using AI as a “supplemental tool,” allowing space to explore AI ethically with an instructor, and learning different AI tools and uses (e.g., pinpointing main points, generating ideas, and managing assignments). One student stated, “She showed us how to navigate it properly, allowing assignments to become more manageable.”
Additionally, 80% (n = 8) of undergraduates reported that AI positively improved their writing and grammar. In comparison, 58% (n = 7) of graduate students found that AI positively impacted their ability to gather research. These data highlight the importance of differentiating the purpose and the AI tool for different courses and levels of students. In this study, undergraduates found AI helpful in writing, whereas graduate students relied on AI to support their development as researchers. Figure 1 details how undergraduates and graduates reported the positive impacts of AI on their learning during the summer semester.
Figure 1
Reported Positive Impact of AI on Student Learning

Finally, a majority (68%, n = 22) of undergraduate and graduate students reported a positive change in or positive view of AI at the end of the course. One graduate student said that AI provides ideas on ways to optimize the curriculum, which frees up extra time for students. An undergraduate student also indicated a positive change, expressing that AI did not do the work for her but instead helped her creatively expand her ideas.
Graduate students were required to disclose their AI use in the comment box when they submitted their assignments to the learning management system (LMS). In the graduate curriculum design course, 67% (n = 12) of students reported using AI to assist them in developing a curriculum unit. In their comments, they noted that ChatGPT was a valuable tool that provided them with a range of ideas and examples for the activities and assessments within their unit and assisted in organizing their ideas for their final product. One graduate student expressed enjoyment using AI to enhance her project, specifically the ability of AI to provide alternative activities, clarify concepts, and organize content. Again, this also allowed the student more time to focus on creative unit elements.
Most students in a graduate education program are veteran teachers, and the addition of AI to their rich database of ideas resulted in creative units firmly rooted in foundational educational concepts. One of the veteran educators in the course explained that AI suggested potential themes, activities, and assessment methods that both enriched the unit and aligned with standards. This educator also found the AI-generated templates, rubrics, and support for aligning concepts and structure across the unit to be extremely beneficial.
Overall, the intentional modeling and integration of AI within undergraduate and graduate coursework positively impacted students’ perceptions of AI throughout the semester. Modeling allowed students to see practical and ethical uses of AI in their coursework and provided opportunities to discuss potential applications in K-12 classrooms.
Student Reflections on AI Integration Best Practices
Our second research question aimed to contribute to the growing field of recommended instructional practices using AI by examining student and instructor reflections at the end of the summer term. Specifically, our survey asked students to reflect on how the guidance from their instructor impacted their use of AI and what advice they would give other instructors about using AI in coursework. The qualitative responses to those questions yielded important insights into pedagogical decisions surrounding AI integration.
Students overwhelmingly reported a need for proper guidance and appreciated their instructors’ clear direction on using AI tools effectively and ethically. This finding was evident across both undergraduate and graduate responses. One student noted, “It was great to think about how I could use AI as a meaningful supplementary tool. The professor helped by encouraging me to try using AI for some assignments and asking me to reflect on the experience.”
AI was not new to many students, so intentional instructor guidance helped direct students to areas where its use was allowed and appropriate. One student noted this difference in their experience by explaining, “Using AI is usually discouraged from most of my professors so I would say it didn’t really impact me because I would use it anyways.” This statement may indicate how many students approach their coursework and reflects the vastly different stances on AI held by instructors and students.
Bridging that gap in the perception of AI’s role in coursework is essential to supporting students as they navigate this new technology. When asked to advise other instructors about AI integration, a student commented, “Perhaps adding elements that allow for a combination of AI and critical thinking skills, to teach ways to use it fairly.” Another student suggested that instructors “be open as possible to utilizing AI for good cause but to also not rely on them.”
Students’ suggestions pointed toward a balanced approach to AI integration: finding meaningful ways to incorporate AI without taking away opportunities to flex their creative thinking. In other words, “Only use it if it makes sense. Don’t force it.”
Specifically, students pinpointed areas in education coursework that benefited greatly from AI tools. An undergraduate student suggested, “It should be incorporated during lesson planning. AI is such a great resource to bring ideas into the classroom!” For a preservice teacher, experience with lesson ideas may be limited to the types of lessons they have observed or read about in their coursework. Building from a larger database of ideas could help bridge the gap in experience between a preservice and a veteran educator. As previously stated, graduate students noted the great value of AI tools in supporting the development of their research skills. A graduate student said, “Using AI to find scholarly articles was a game changer. AI was able to pinpoint scholarly articles that pertain directly to my coursework and research in order to be more efficient and effective.”
Again, students see the value of using AI to bridge a gap in experience. New graduate students may not have a depth of research experience, especially if their undergraduate degree was focused on meeting initial teacher certification requirements. Using AI as a support to build their research skills may be a valuable way to support teacher researchers within their graduate coursework.
Instructor Reflections on AI Integration Best Practices
To complement the suggestions from students on best practices of AI integration, we analyzed data from both instructors’ Student Perception of Teaching (SPOT) evaluations completed by students (postcourse) and notes from three instructor reflection meetings (precourse, midcourse, and postcourse). The SPOT evaluation data indicated that addressing AI within the courses did not diminish the quality of the learning experience. Across the undergraduate instructor’s two summer courses, an average of 98% of students selected “completely agree” on the item indicating that the instructor encouraged critical thinking. Similarly, an average of 95% of the graduate instructor’s students selected “completely agree,” reflecting that critical thinking was also encouraged in those courses. With previous research and students indicating ethical concerns surrounding AI, ensuring the quality of instruction was a top priority for the instructors.
The analysis of instructor reflections revealed several strengths and challenges following AI integration through the summer semester. Strengths of AI integration included the ability to differentiate student and course content needs easily and quickly. Across all courses in this study, students had choices built into many course assignments. For example, undergraduate students could choose the focus of the lesson they were writing, and graduate students could choose the topic they were researching. With so much variation, it can be difficult for an instructor to find quality examples and instructional material that pertain to what each student has chosen to work on.
Using tools like ChatGPT and elicit.com, we could quickly change prompts to show a myriad of examples to students and talk through aspects of the project with real-time examples. Similarly, students could work alongside us with the tool as we modeled prompts and assessed the output AI returned. To do something similar without AI, we would have had to save exemplary work from previous students over many semesters and obtain permission to share it with new students. The ability to access such a wide range of resources was certainly a strength of this AI integration. Table 1 outlines the specific assignments we each chose for AI integration, how we modeled AI use, and our recommendations for future integrations.
Table 1
| Assignment | AI Integration | Method for Modeling | Recommendations for Future Integration |
|---|---|---|---|
| Literature Review | Used AI (ChatGPT) to brainstorm research questions on a topic and to use that research question to search for academic articles (elicit.com) | Demonstrated AI tools during live, virtual sessions | Provide clear guidelines on how to use and cite AI tools in assignments |
| Rubric Design | Used AI (ChatGPT) to create a rubric based on curriculum unit objectives and share it with peers for feedback | Asked students to use a specific prompt (ChatGPT) to produce a rubric and then share it in a discussion board post and reflect on how they would modify it for their unit | Provide multiple prompts to help students revise the rubric to assignment specifications |
| Backward Design Curriculum Unit | Used AI to generate ideas for unit focus and generate ideas for performance assessments | Demonstrated AI tools during live, virtual sessions | Provide instruction on evaluating the results from AI for accuracy and practical use |
| Lesson Planning | Used AI (ChatGPT, CoPilot) to brainstorm grade-level text and lesson plan ideas | Demonstrated AI tool during a virtual session | Provide examples at multiple grade levels |
| Literature Rationale | Used AI (ChatGPT, CoPilot) to compare students' annotated bibliographies of a text to one created by AI | Modeled an example and used a discussion board where students commented on the pros and cons of the AI-generated response compared to their response | Provide reflection prompts to allow for a deeper analysis from students |
While there were many benefits from integrating AI into our courses, our reflections also surfaced challenges and suggestions for revision moving forward. Most notable in our reflection data was the shared need to provide additional instruction on how to revise the products produced by AI. This could include an analysis of the types of academic articles provided by elicit.com or the feasibility of the curriculum ideas provided by ChatGPT. In each instance, we reflected that it would have been beneficial to model this evaluation step and include a discussion of what might need editing or how to spot false information.
One practice that was not a challenge itself but rather a potential solution to a challenge shared across the field of education was adding an AI disclosure requirement to each assignment. It gave us immediate feedback on how many students were using AI and for which portions of the assignment. It also allowed students to be honest about their use of AI without fear of repercussions. We consistently reminded students that the final product for any assignment should be the student’s own work, but knowing where they had used AI helped us give targeted feedback on how to support their learning.
Implications
As a result of this study, there are several implications for practitioners regarding the integration of AI tools in coursework. Over 25% of our students reported that they rarely or never used AI tools to assist with their assignments or coursework, indicating that many students are unfamiliar with navigating the appropriate use of AI tools. To address this, we recommend explicit modeling lessons in which instructors demonstrate how AI tools can be used on course-specific activities and assignments. Ideally, these lessons occur in real time (and are also recorded) so students can work alongside the instructor while the AI tool is being modeled. Additionally, we recommend creating an AI disclosure method for assignments so students can be transparent about using AI tools. This can coincide with a discussion or lesson about how to properly cite AI within their work to avoid plagiarism.
The study highlights the importance of differentiating AI tools and their purposes for different courses and levels of students. For example, undergraduates found AI helpful for writing, while graduate students used AI to support their research. AI tools can help bridge the gap in experience between preservice and veteran educators by providing a larger database of ideas and resources. This can be particularly useful in areas like lesson planning and research. Instructors should find meaningful ways to integrate AI that complement students’ skills.
There is much work to be done around critically examining, refining, and editing AI-generated output so that the final assignment remains a product of the student’s own work. Both students and instructors should engage in ongoing reflection and adaptation of AI use in coursework. This can help identify best practices and address any ethical concerns or challenges.
Last, teacher education courses are uniquely positioned in this discussion because, as instructors, we are not only integrating AI into our practice but also shaping the practice of future and current classroom teachers through modeling intentional and thoughtful technology integration. Our responsibility as teacher educators is to critically examine new technologies and work alongside our students to assess their value through collaborative feedback. By sharing our work as action researchers with our students, we hope to provide them with a blueprint to use research to make instructional decisions within their classrooms.
Limitations
While this study provides actionable insights for teacher educators to consider when implementing AI into their instructional practices, it has some limitations. First, the study relied primarily on self-reported data about students’ experiences with AI and their perceptions of AI tools. These data may be subjective and may not accurately reflect the instructional practices within the course or the other courses the students discussed. As is standard in action research, the instructors also served as the researchers, which could bias student responses. However, all data collection was voluntary and occurred outside of graded assignments. Last, this study was conducted within a single college of education; future research should be conducted across multiple disciplines to generalize the findings.
References
Alharbi, W. (2023). AI in the foreign language classroom: A pedagogical overview of automated writing assistance tools. Education Research International, 2023, 1–15. https://doi.org/10.1155/2023/4253331
Almusharraf, N., & Alotaibi, H. (2023). An error-analysis study from an EFL writing context: Human and automated essay scoring approaches. Technology, Knowledge and Learning, 28(3), 1015-1031. https://doi.org/10.1007/s10758-022-09592-z
Cascella, M., Montomoli, J., Bellini, V., & Bignami, E. (2023). Evaluating the feasibility of ChatGPT in healthcare: An analysis of multiple clinical and research scenarios. Journal of Medical Systems, 47(1), 1–5. https://doi.org/10.1007/s10916-023-01925-4
Cavaleri, M. R., & Dianati, S. (2016). You want me to check your grammar again? The usefulness of an online grammar checker as perceived by students. Journal of Academic Language and Learning, 10(1). PP #S?
Chan, C. K. Y., & Colloton, T. (2024). Generative AI in higher education: The ChatGPT effect. Taylor & Francis Group.
Chaudhry, I. S., Sarwary, S. A. M., El Refae, G. A., & Chabchoub, H. (2023). Time to revisit existing student’s performance evaluation approach in higher education sector in a new era of ChatGPT – A case study. Cogent Education, 10(1), Article 2210461. https://doi.org/10.1080/2331186X.2023.2210461
Cochran-Smith, M., & Lytle, S. L. (2009). Teacher research as stance. In S. E. Noffke & B. Somekh (Eds.), The SAGE handbook of educational action research (pp. 39–49). Sage.
Dorado, L. B. (2024). Lived experiences of graduate students in writing their academic research paper using artificial intelligence (AI) tools. 11th ISC 2024: Research and Education Sustainability: Unlocking Opportunities in Shaping Today’s Generation Decision Making and Building Connections.
Farrokhnia, M., Banihashem, S. K., Noroozi, O., & Wals, A. (2024). A SWOT analysis of ChatGPT: Implications for educational practice and research. Innovations in Education and Teaching International, 61(3), 460–474. https://doi.org/10.1080/14703297.2023.2195846
Fowler, D. S. (2023). AI in higher education: Academic integrity, harmony of insights, and recommendations. Journal of Ethics in Higher Education, 3, 127–143. https://doi.org/10.26034/fr.jehe.2023.4657
Fukuda‐Parr, S., & Gibbons, E. (2021). Emerging consensus on ‘ethical AI’: Human rights critique of stakeholder guidelines. Global Policy, 12(S6), 32–44. https://doi.org/10.1111/1758-5899.12965
Gašević, D., Siemens, G., & Sadiq, S. (2023). Empowering learners for the age of artificial intelligence. Computers and Education: Artificial Intelligence, 100130. https://doi.org/10.1016/j.caeai.2023.100130
Gayed, J. M., Carlon, M. K. J., Oriola, A. M., & Cross, J. S. (2022). Exploring an AI-based writing assistant’s impact on English language learners. Computers and Education: Artificial Intelligence, 3, 100055. https://doi.org/10.1016/j.caeai.2022.100055
Henriksen, D., Richardson, C., & Mehta, R. (2017). Design thinking: A creative approach to educational problems of practice. Thinking Skills and Creativity, 26, 140-153.
Leoste, J., Jõgi, L., Õun, T., Pastor, L., San Martín López, J., & Grauberg, I. (2021). Perceptions about the future of integrating emerging technologies into higher education—The case of robotics with artificial intelligence. Computers, 10(9), Article 110. https://doi.org/10.3390/computers10090110
Lund, B. D., & Wang, T. (2023). Chatting about ChatGPT: How may AI and GPT impact academia and libraries? Library Hi Tech News, 40(3), 26–29. https://doi.org/10.1108/LHTN-01-2023-0009
Malik, A. R., Pratiwi, Y., Andajani, K., Numertayasa, I. W., Suharti, S., Darwis, A., & Marzuki. (2023). Exploring artificial intelligence in academic essay: Higher education student’s perspective. International Journal of Educational Research Open, 5, 100296. https://doi.org/10.1016/j.ijedro.2023.100296
Merriam, S. B., & Grenier, R. S. (Eds.). (2019). Qualitative research in practice: Examples for discussion and analysis. John Wiley & Sons.
Miranty, D., & Widiati, U. (2021). An automated writing evaluation (AWE) in higher education. Pegem Journal of Education and Instruction, 11(4), 126–137. https://doi.org/10.47750/pegegog.11.04.12
O’Neill, R., & Russell, A. (2019). Stop! Grammar time: University students’ perceptions of the automated feedback program Grammarly. Australasian Journal of Educational Technology, 35(1), 42-56. https://doi.org/10.14742/ajet.3795
Rice, S., Crouse, S. R., Winter, S. R., & Rice, C. (2024). The advantages and limitations of using ChatGPT to enhance technological research. Technology in Society, 76, 102426.
Rodrigues, R. (2020). Legal and human rights issues of AI: Gaps, challenges and vulnerabilities. Journal of Responsible Technology, 4, 100005. https://doi.org/10.1016/j.jrt.2020.100005
Rospigliosi, P. (2023). Artificial intelligence in teaching and learning: What questions should we ask of ChatGPT? Interactive Learning Environments, 31, 1-3. https://doi.org/10.1080/10494820.2023.2180191
Storey, V., & Wagner, A. (2024). Integrating artificial intelligence (AI) into adult education: Opportunities, challenges, and future directions. International Journal of Adult Education and Technology, 15(1), 1-15. https://doi.org/10.4018/IJAET.345921
Wagner, G., Lukyanenko, R., & Paré, G. (2022). Artificial intelligence and the conduct of literature reviews. Journal of Information Technology, 37(2), 209-226. https://doi.org/10.1177/02683962211048201
Xiang, S., Deng, H., Wu, J., & Liu, J. (2024). Exploring the integration of artificial intelligence in research processes of graduate students. In Proceedings of the 2024 6th International Conference on Computer Science and Technologies in Education (pp. 110–113). IEEE. https://doi.org/10.1109/CSTE62025.2024.00027
Zhai, X. (2022). ChatGPT user experience: Implications for education. SSRN. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4312418
Appendix
AI Reflection Survey