When Bryn A. Williams, an English teacher at Burnaby Central High School in Canada, gave his class a creative writing assignment, he expected a range of imaginative responses with distinct personal voices. Instead, as he began reading through the submissions, something immediately felt off. Nearly all of the essays sounded eerily similar, as though they had been written by the same person. It did not take long for the 52-year-old teacher to figure out the reason — virtually his entire class had used AI to complete the homework, as reported by Newsweek.
The assignment itself was genuinely compelling. Williams had asked his students, all aged 16 and 17, to describe “a day in the life in a dystopian high school.” The class had recently studied ‘Animal Farm’ and watched the film ‘V for Vendetta’, which had sparked rich classroom conversations about totalitarianism, authoritarianism, and what it feels like to live without freedom. Williams wanted the students to take an environment they knew intimately, their own school, and reimagine it as a place stripped of safety and personal liberty. It was the kind of prompt that invites genuine reflection and creativity.
The problem became clear almost immediately. As Williams read through the papers, the opening lines were practically interchangeable from one essay to the next: alarms going off at six in the morning, robotic announcements over intercoms, slogans about obedience plastered on the walls, wristbands tracking students’ every movement. When Williams typed the same prompt into ChatGPT himself, the output was nearly identical to what his students had handed in. The pattern was undeniable.
“Of course I was disappointed,” Williams said. “But already after the second essay I read, the similarities were obvious and a bit creepy. The same number of paragraphs, almost all had 10, and all ended with a two-sentence paragraph. All followed the exact same daily timeline, just with small variations.” The structural uniformity made it clear that the students had not simply been inspired by similar ideas but had outsourced the entire creative process to the same tool.
Williams was candid about his own role in the situation. Around half the class had asked if they could write their essays at home on their computers and email them in, rather than completing the work by hand during class time. He agreed, and that decision opened the door to AI use. “Should have asked for drafts they worked on in class, but I trusted my students,” he acknowledged. It was a moment of honest self-reflection from an educator grappling with a challenge that teachers around the world are increasingly facing.
In response to the experience, Williams overhauled his approach. Today, roughly 90 percent of assignments in his classes are completed during class time, written on paper. He noted, however, that even this is not entirely foolproof, since tools like Google Lens make it possible to photograph handwritten text and process it digitally. The cat-and-mouse dynamic between educators and AI tools shows no sign of slowing down, and teachers like Williams find themselves constantly adapting.
Williams shared the story in a TikTok video posted to his account, @my_messy_desk, and the clip racked up more than 756,000 views before being taken down. The comment section was a battleground of opinions. “Sounds like a really fun assignment — why would they even want to use ChatGPT for that? Don’t kids have imagination anymore?” one commenter wrote, while another simply declared that “creativity is dead.” Others pushed back, arguing that AI is just the latest tool in a long line of technological shortcuts, no different in principle from students reaching for calculators instead of working through math problems by hand.
The debate touches on something much larger than one classroom in Canada. For decades, educators have wrestled with how technology changes the nature of learning and what it means to truly engage with a task. The arrival of powerful, freely accessible AI writing tools has accelerated that conversation dramatically. What makes the Williams story particularly striking is the subject matter itself, a creative exercise designed to build empathy and critical thinking about oppressive systems, handed off to a machine that has no experience of either.
ChatGPT, developed by OpenAI and launched publicly in late 2022, became one of the fastest-growing consumer applications in history, reaching 100 million users in just two months. It is built on a large language model trained on vast amounts of text data and is capable of producing fluent, structured writing on virtually any topic. That fluency is exactly what makes it so tempting for students under deadline pressure and so difficult for teachers to detect. Academic integrity policies at schools and universities worldwide have been scrambling to catch up ever since the tool went mainstream.

‘Animal Farm’, the novella by George Orwell published in 1945, is one of the most commonly taught texts in high school English classes precisely because it encourages students to think critically about power and conformity. The irony of students using a conformity-producing AI to write about dystopian conformity was not lost on many of those who commented on Williams’ video.
What do you think about students using AI for creative assignments, and where should the line be drawn between assistance and academic dishonesty? Share your thoughts in the comments.




