Abstract
Late 2022 saw the release of ChatGPT, the first of many powerful generative artificial intelligence (genAI) tools that quickly began transforming the way people work and study. Although Japanese universities were cognizant of genAI’s increasing societal impact and its potential application in educational endeavors, they overwhelmingly left responsibility for establishing and enforcing usage policies for genAI and other digital tools (DTs) in individual courses to the instructors. Allowing instructors this free hand in determining genAI’s role in their courses seems both sound and appropriate, as (a) instructors are students’ first and foremost point of contact with course aims, materials, objectives, and pedagogies, and (b) instructors are in the best position to monitor and discourage student genAI misuse. However, it is argued here that, given genAI’s unprecedented power and allure, leaving genAI policy monitoring solely to instructors is not only ineffectual as a deterrent to student cheating but also burdensome, as it imposes numerous additional labor costs on instructors. This paper supports this argument with analyses of course failures over a ten-year period that resulted directly from genAI and other DT misuse in English-only research reports submitted by Japanese university students enrolled in an English language lecture course. The analyses reveal a dramatic increase in course failure rates after ChatGPT’s launch, indicating that instructor-level genAI policy monitoring is ineffective. Suggestions for mitigating genAI misuse are proffered. Additionally, as other factors (e.g., pandemic influences) may exacerbate student genAI misuse, a call for further research is made.
Presenters
Brian Rubrecht, Professor, School of Commerce English Department, Meiji University, Tokyo, Japan
Details
Presentation Type
Paper Presentation in a Themed Session
Theme
Keywords
Generative AI, Academic Misconduct, Instructor Responsibilities, Japanese Universities, English Course