These selected case studies were written by course participants for their final reflective task.
The situation occurred while I was preparing for my classes at The American College, Madurai, where I used AI tools like ChatGPT to assist in designing classroom tasks and interactive activities for my students. The goal was to create engaging activities that would foster critical thinking, creative expression, and a deeper understanding of subjects like Indian English Literature and Communication Skills. I used ChatGPT to develop exercises and quiz questions, refine academic content, and structure complex ideas in simpler terms, which improved the flow of my course materials. The decision to use AI stemmed from my desire to enhance lesson planning efficiency and create a more dynamic learning environment. The tool provided quick, relevant suggestions and offered new perspectives that I hadn't considered, contributing to a well-rounded curriculum. My experience with ChatGPT has been positive. It acted as a helpful assistant, streamlining my work and offering creative solutions for complex academic tasks. This experience has shown how AI can be an effective tool for educators, particularly in curriculum development, to design personalized and innovative learning experiences. Other teachers, especially those focused on curriculum design or tech-enhanced teaching, may find similar benefits from using AI in their classrooms.
The use of AI (ChatGPT) in preparing classroom tasks offers significant benefits, particularly in enhancing lesson planning efficiency and fostering more creative, engaging learning environments. It saves time, allowing teachers to focus on personalized student interactions and innovative teaching methods. However, over-reliance on AI could potentially stifle teachers' creativity and critical thinking skills, though this harm is minimal compared to the benefits. From an ethical perspective, AI usage respects privacy, consent, and ethical boundaries, as teachers retain responsibility for classroom decisions, with AI serving as a supportive tool. Virtue theory highlights how AI encourages creativity, curiosity, and responsibility, requiring teachers to evaluate content critically to ensure alignment with educational goals. In terms of justice, AI offers advantages to teachers but raises concerns about equitable access, potentially widening the gap between privileged and less-resourced educational contexts. However, AI is meant to supplement, not replace, human teaching, ensuring that the human element of education remains central. The key is using AI inclusively and responsibly to avoid reinforcing inequalities. Overall, when used responsibly, AI is a valuable tool in education, enhancing teaching quality, fostering creativity, and enriching the learning experience.
After evaluating the use of AI (ChatGPT) in preparing classroom materials, my conclusions emphasize its positive impact on teaching, particularly in terms of efficiency and creativity. While consequentialism highlights the benefits of time-saving and personalized learning, it also warns against over-reliance on AI, which could stifle teachers' creativity and critical thinking. Rights and rules confirm that AI respects ethical boundaries, as teachers remain responsible for content, and no privacy or consent issues arise. Virtue theory supports using AI to enhance creativity and responsibility, aligning with the belief that it should complement, not replace, human judgment. Justice raises concerns about equitable access, underscoring the importance of making AI tools available to all educators, especially those in under-resourced environments. The most critical ethical features for me are justice and virtue theory, as ensuring AI's responsible and inclusive use is essential. In response, I plan to use AI as a supplementary tool, ensuring it supports, rather than replaces, human creativity in the classroom. I will advocate for equitable access to AI tools and share the benefits and ethical considerations with colleagues to encourage a balanced, responsible approach in education.
This semester, the teaching team I am part of decided to include a short reflection practice at the end of class. To facilitate students' reflection and gather data, we used a digital platform […], initially designed for self-assessment and feedback provision by peers and teachers. The tool has a built-in AI writing assistant, which prompts students in particular directions for deeper reflection. By conducting the reflection practice using [the digital platform], we hoped students could synthesise what they had learned in one session and reflect on their learning, which would also contribute to the final reflection assignment students need to complete for the course. The tool also enabled a private communication channel between students and the teacher, as their reflective writing is not visible to other classmates. There were both good and bad experiences with the tool and the reflection practice. On the positive side, the privacy encouraged several genuine responses from students, the most memorable being a student confessing to me that, as an introvert, he found being invited to contribute verbally in class extremely stressful. On the negative side, while AI tools could facilitate reflection by prompting the student, students also used AI tools to write their reflections. By negative, I do not mean that AI-assisted writing as a whole is harmful. It is just that a few students directly used AI-generated responses for their reflection task, which meant that the potential learning gains of the activity were missed.
In this situation, AI technologies were used for multiple purposes. From our perspective as teachers, the AI-powered tool was meant to support student reflection. Without documentation of how student reflection evolved in response to the AI prompts, it is hard to judge whether this goal was achieved. However, when using the tool for the first time in the classroom, I heard a few students complaining that the tool kept asking for more details and became more critical, which could have put pressure on students who intended to follow its guidance. Another goal of this technology-aided activity was to establish a private channel for student-teacher communication and promote honest reflection, which seemed to benefit some students. One group that particularly benefited was students who were shy about speaking openly in the classroom. By creating a space for their voices (e.g., the confession mentioned in the first section) and eliciting their learning outcomes, we could get to know the students better and assess their learning, ensuring equal opportunities for participation and care for these students. Students, on the other hand, used AI tools to facilitate their writing. For those who only used the tool to support and polish their writing, it was beneficial and lowered the language barrier to participation. I have repeatedly received on-topic and elaborate reflections from a few students who seem to struggle to articulate ideas in the classroom. With this activity, their voices were captured and, accompanied by the follow-up feedback I provided, their ideas were appreciated, which would have been challenging if students had only one channel of communication in the classroom. For the students who used AI tools to directly generate content to fulfil the task requirement, however, their learning was harmed without their realising the consequences.
Additionally, by submitting AI-generated content as if it were their own, these students failed to demonstrate the expected honesty.
From the rights-and-rules as well as the justice lens, the use of technology widens the opportunity for student participation and allows each student's voice to be heard. However, from the consequentialist lens, student confusion and pressure are a concern, and lost learning opportunities for students who "outsource" the writing task are an undesirable outcome. More importantly, authenticity and honesty are at risk: while AI tools helped students with more limited language proficiency to convey their ideas truthfully, students who relied on AI-generated content breached academic integrity, which reflects badly on their character. Therefore, to leverage the benefits and control the potential negative impacts, I would better educate students on why and how AI tools could be used in this reflection task.
The year was 2019, pre-Covid. ChatGPT was just being murmured about, and people were going crazy over this new technological advancement. As usual, teachers were slow to catch up with students on the technology turf. In a university named SHENBAGAM (name changed) in the old and traditional city of Madurai in South India, creative writing was a much-celebrated event. Creative Writing Workshops were held annually at a hill station far away from the university to give students freedom from academic pressure to run their imagination wild. After that workshop, monthly creative writing sessions would be held in the university itself by a professor. Students would be given prompts to write poems, flash fiction, monologues, dialogues, and reflective essays, one each month. Students felt that such exercises kindled their imagination and unbridled thinking beyond measure. The goal was to make students understand the significance of creative writing and expression in an era driven by technology. In one such session, the teacher gave a prompt for writing a poem of not more than 20 lines. The teacher always insisted on the pen-and-paper method; laptops and tablets were discouraged. But that day a student sneaked in a laptop and prevailed upon the teacher to let him use it. The teacher reluctantly agreed. To the teacher's shock and surprise, the student swiftly came back with a finished poem. The teacher could not believe it and argued that the student must have obtained it from a Google search. The student denied it, with a smirk on his face all along. The teacher understood that the student was playing with him. After some time, the student yielded, saying that he had used ChatGPT, the in-thing at that time. He also explained to the teacher what ChatGPT was. The teacher had to remind him that creative writing was best done with pen and paper. This event prompted the teacher and the organisers of the annual Creative Writing Workshop to ban technological devices from the sessions.
In the end, the pen-and-paper method won. This situation brings into focus the use of ChatGPT in teaching and experiencing creative writing. It seems to have many adverse effects in Humanities education, particularly in literary studies, where AI and other similar models must be used with much precaution. In the given case, the student had a technological advantage over the other students and did not think of it as an ethical issue, or perhaps he had not been properly oriented to the emerging technologies. It is also true that, in this case, the student's triumphant proclamation that he had finished counteracts the very idea of creative writing. The teacher's focus was on the process more than on the completion of the task.
The incident described throws up a few important questions about the use and consequences of employing ChatGPT in cognitive practices. Such mindless use of ChatGPT in creative writing defeats the purpose of organising such workshops. It does more harm than good to the cognitive, creative expression of students. If students begin to rely on external prosthetics for internal processes, the effects would be damaging. As a teacher trying to train students to think, I would be glad if no technological support were sought. When the student completed the task, he did not care that he had not done the thinking himself; he was unconcerned about the impact this would have on his most important skills: critical and creative thinking. The student was, in fact, initially trying to deceive, although playfully. The teacher's intervention was indeed an appropriate and timely act, as it restored the level playing field among all the other participants. Through the virtue lens, one could look at this as a matter of deception. While other students did not have their laptops, and while the teacher trusted him to 'just' write on the laptop, his behaviour violated an agreement to equality in the classroom. Perhaps the teacher should be more proactive and sit with individual students to offer help and prompt them into proper writing.
With the evolution of AI models and the far-reaching impact they seem to have on teaching and learning, particularly in literary studies, the challenge is to navigate between practicality and the fetish created by technology. As a teacher of literature, my question remains whether AI could really be beneficial in giving an in-depth understanding of texts. E-books and other such formats are fine, since only the format is different. But reliance on AI for analysing and interpreting a text would be self-defeating. In my view, the human brain is far superior to and more flexible than AI. As a response, I would be more vigilant in checking whether students are using their devices appropriately, without giving a sense of intrusion. I would also hold a debate on the importance of creative and critical thinking as an inner core skill that should not be outsourced to external devices. Finally, I would strive to create an awareness among students that thinking is a unique skill that needs to be developed individually.
This semester, I am teaching a course titled Applied English to second-semester students. The class is designed to help students develop their English language skills and apply them to real-life contexts, particularly in behavior and business management within the agricultural sector. Students are encouraged to use English to read, write, listen, and speak in ways that reflect agribusiness-related topics, with the long-term goal of enabling them to build internationally-scaled agricultural ventures. Two weeks prior to the Eid holidays in 2025, I assigned my students an essay using creative writing methods, based on their personal experiences after participating in Lasallian Formation, an initiation spirituality program for new students at our university. The objective of this task was to introduce various writing approaches that empower students to express their ideas creatively. Given the rising use of Generative AI, I allow its use in my class, limited to serving as a supporting tool that helps students gain a better understanding of the material, discover relevant references for academic writing, and develop their skills and knowledge in alignment with their unique talents and interests. Long story short, after reviewing their submissions one by one, I noticed that two essays bore a striking similarity: about a 90% match. This left me in a dilemma about how to fairly assess their work. To give some context, all assignments and learning modules in my class are facilitated through our LMS and Google Education Suite (Google Classroom). I firmly reject any form of plagiarism, especially over-reliance on Generative AI, because heavy reliance on Generative AI and other technologies devalues critical thinking and the exquisite workings of the human brain. However, I also believe it would be unfair to make immediate judgments or accusations without understanding the full picture.
So, I decided to turn the following class into an open discussion. I invited every student present to share their process: how they came up with ideas and how their writing evolved. One by one, they shared their stories. Then, one student's story moved me deeply. She explained how, at the time the assignment was given, she was experiencing a severe mental health crisis […]. This intense situation disrupted her focus and ability to complete her assignments. She admitted that she had relied heavily on ChatGPT to write her essay, not out of laziness or dishonesty, but because she was overwhelmed, emotionally paralyzed, and unable to find clarity. She acknowledged that she had not handled the situation well and took full responsibility for the consequences. Still, she was transparent in her reflection and showed vulnerability in explaining her actions.
As a teacher, I found myself in an ethically complex situation that challenged me to balance academic integrity with empathy and humanity. When I discovered two student essays with a high degree of similarity, my initial concern was centered on upholding fairness and discouraging plagiarism. However, after facilitating an open class discussion, one student shared that she had relied on ChatGPT to complete her assignment due to a severe mental health crisis: she had been receiving threats to her safety, which left her emotionally overwhelmed and unable to focus. From a consequentialist perspective, I recognize that her use of AI may have helped her cope in a moment of distress, even though it risked undermining academic standards. While her approach did not align with our usual expectations, I believe her rights to safety, dignity, and support as a learner were more urgent in that moment. The situation also prompted me to reflect on the kind of learning environment I want to foster: one that values integrity, yes, but also compassion, courage, and open communication. From a justice standpoint, I felt it was essential to listen to each student's process and ensure that no one felt shamed or dismissed, while still affirming the importance of clarity and fairness for all. Ultimately, this experience reminded me that ethical teaching isn't just about enforcing rules; it's about understanding the whole person behind the assignment and guiding them toward responsible, reflective engagement with both their challenges and the tools they choose to use.
Reflecting on this situation, I realize that while different ethical frameworks offer varied insights, they converge on one important point: the need for both integrity and compassion in education. As a teacher, I am committed to upholding academic honesty, yet this experience reminded me that students bring their whole selves into the classroom—including their struggles. The consequentialist lens helped me understand the immediate benefit of the student’s actions in coping with distress, while the rights-based and justice perspectives emphasized my duty to create a safe, fair, and supportive learning environment for all. Most importantly, virtue ethics resonated with my role as an educator—not just to teach content, but to model empathy, patience, and reflective judgment. Moving forward, I am committed to revisiting and clarifying my class policies around the use of AI, not to penalize, but to guide students in using such tools ethically and creatively. I also feel called to strengthen open communication and build systems of support where students feel safe to ask for help before reaching a breaking point.
Technology is indeed a blessing, and it helped me in a challenging situation that occurred in my department. Two course teachers handle each course, and the workload is shared. After the second internal assessment, we are supposed to enter the students' marks. I did my part, but the other teacher forgot to enter hers. Before the semester ended, she received a call from the examination office saying that the marks had not been entered. She tried to pin it on me. The next call went to the HOD, raising a complaint against us; again, the blame was placed on me. So I went to the examination office and asked whether the software could show who had entered the marks. It showed my identity and the date and time when the marks were entered. I was happy that the software didn't lie. This might happen among other professors too, but I have not heard anyone admit it honestly.
Yes, it minimised harm. Everything has its pros and cons, but with technology the benefits weigh more. It both respects and violates rules. Yes, people are treated as a means to an end. There are elements of deception, more in humans than in technology. It is our duty and responsibility to upload the assessment marks, and the system helps us realise that responsibility. Honesty should be practised when it comes to students' marks, not taken for granted. The other person was believed because she could lie with confidence, but technology unmasked the masked.
Virtue and justice frameworks were the most demanding in this situation. In response, I simply had to stay calm and let the truth come out instead of accusing the other person. Technology has both a sunny side and a dark side. Let us use it in an ethical way to build and support communities.
Enquiry: chtl@hkbu.edu.hk
Copyright © Hong Kong Baptist University. All rights reserved.