Trend Lines: Managing Tech Advances in the Age of ChatGPT

Summer 2023

By David Cutler

This article appeared as "Power Tool?" in the Summer 2023 issue of Independent School.

As a newer teacher at Palmer Trinity School (FL) in 2010, I remember chatting with my then headmaster, Sean Murphy, about technology in the classroom. He gave me a copy of Neil Postman’s The End of Education: Redefining the Value of School. “It’s a little dated,” he told me, “but I think you might find some useful, interesting stuff in there.” I appreciated the book at the time, and upon revisiting it this past school year in light of the latest headlines, I’m again struck by the timeliness of a particular passage from it, first published in 1995, during the internet’s nascent stages. 

“The computer and its associated technologies are awesome additions to a culture, and they are quite capable of altering the psychic, let alone the sleeping, habits of our young,” Postman writes. “But like all important technologies of the past, they are Faustian bargains, giving and taking away, sometimes in equal measure, sometimes more in one way than the other. It is strange—indeed, shocking—that with the 21st century so close on our heels, we can still talk of new technologies as if they were unmixed blessings, gifts, as it were, from the gods.”

Today, Postman and Murphy continue to influence how I think about education and technology, particularly with recent advancements in artificial intelligence (AI), which Google defines as “a set of technologies that enable computers to carry out a range of advanced functions, such as being able to see, comprehend, and translate verbal and written communication; analyze data; make suggestions; and more.” 

After 16 years of teaching high school history and journalism, I know better than to blindly accept dire conclusions like those in The Atlantic’s recent article-gone-viral “The End of High School English.” In this December 2022 piece, the author writes about ChatGPT, a natural language processing system introduced in late 2022 by OpenAI that produces remarkably humanlike responses to various inputs, and how it “may signal the end of writing assignments altogether—and maybe even the end of writing as a gatekeeper, a metric for intelligence, a teachable skill.” 

Technology is a powerful tool, and it must always be approached with caution and care. Teaching about it is a challenging but important task. Certainly, we must warn students of its potential for misuse and exploitation, but we must also show them how to use it responsibly and creatively to unlock its true power for positive change.

The Evolution of Classroom Technology

When I think about ChatGPT and similar AI platforms, I can’t help but think about 1982’s Blade Runner, a powerful, futuristic movie. To identify advanced humanlike androids called replicants, an elite police force uses the fictional Voight-Kampff test to measure how subjects respond to emotionally charged questions. It’s that hard to tell a robot from a person. 

How quickly science fiction became reality. In 2014, a chatbot (a program designed to simulate conversation with a human user) duped a third of the judges on a panel into believing that it was an actual 13-year-old boy named Eugene Goostman. This marked the first time that an AI creation passed the “imitation game,” developed by computer scientist Alan Turing in 1950 to evaluate a machine’s ability to provide replies indistinguishable from those of humans. The chatbot scored above 30%, just good enough to win the contest and make headlines.

Automated computer technology has been used in educational settings for far longer than we might think. The Scantron, developed in the early 1970s, simplified tasks such as grading assignments. Its widespread adoption by teachers speaks to its user-friendliness and ability to reduce tedious labor. AI algorithms to assess student understanding are more recent advancements. In fact, my high school students use InQuizitive, an adaptive learning tool that not only personalizes individual students’ learning paths but also uses analytics to help track progress in real time.

Even during my time as a high school student in the late ’90s and early 2000s, translation tools, graphic calculators, and other technologies were being used (and sometimes abused) to help save time and effort. Even then, educators needed to walk side by side with students to guide them toward proper and ethical use of technology.

Response Time

Fear is a natural response to advances in technology, especially those as seismic as AI. Still, this isn’t our first rodeo. Let’s go back to the calculator. 

In 1985, Casio revolutionized the personal calculator market with its FX-7000G, armed with a graphing function and the ability to quickly solve complex equations. That April, a New York Times article, “New Terrain in College Math,” reported on how Warren Page, a professor of mathematics at New York City Technical College, responded to the growing student use of calculators. 

“Problems which can be posed but not solved by a calculator are effective for demonstrating to students that their head-held calculator is much more powerful than their hand-held calculator,” the article quotes Page as saying. “Although calculators can be helpful for computing, they should not be antidotes for the headache of having to think.” As a quick but telling thought experiment, reread Page’s quote, but swap out “calculator” with “AI” or “ChatGPT.” 

Like Page, I remind my students that their minds are far more wondrous and powerful than any AI. I also tell them that nobody ever got stronger by watching somebody else work out, and the same principle holds true for exercising the brain. No matter how powerful AI becomes, it will never be able to spew a response that automatically makes students better critical thinkers and kinder, more compassionate individuals. As seen throughout history, only time, hard work, and resilience can do that; as educators, we must nurture this process, which, our fast-paced culture notwithstanding, is often slow and arduous—not instantaneous, or nearly so, like the speed of AI. Patience, after all, is a virtue. 

If teachers should fear anything about AI, it’s students being unable to spot inaccurate, damaging information. This is because AI is exceedingly good at presenting writing that looks convincing but may actually be false. This raises the stakes for educators, who must ensure that students understand how to determine veracity. To do this, teachers must provide students with a thorough understanding of critical thinking skills, which AI cannot adequately foster (at least not yet).  

Despite the possibility that students will use ChatGPT to write their papers or portions of assignments, I don’t believe that AI will contribute to a massive spike in cheating—even with segments of the media obsessed with pushing that angle. In an independent school setting, with fewer students per section than our public-school counterparts, teachers should be even more familiar with each student’s strengths and weaknesses. As with cases of copy-and-paste plagiarism, an AI-generated submission should stick out like a sore thumb. 

To get to know my students’ abilities and to meet them where they are, I have always given in-class essay tests before assigning larger take-home assignments. I might double down on this practice moving forward, thanks to ChatGPT, but I remain optimistic about student integrity. 

Of course, I’m not naïve; unfortunately, students will cut corners (and likely already have). But I also believe that students who abuse AI would have found another way to cheat without it. ChatGPT might make cheating quicker and easier but not necessarily more enticing. I really want to believe in my students and their ability to make wise decisions. 

Open Doors

Since recent advances in AI are still so new, it’s hard to know how the technology can serve as an effective learning tool. Teachers should continue to experiment with ChatGPT in their classrooms, letting students know that they’re aware of what AI can now do and what it still struggles with—and that they are willing to explore its potential and development alongside them.

This spring, I used the overhead projector during class to showcase how Jasper—another AI platform—responded to my American history essay prompts. The students were amazed by its accuracy yet critiqued its prose and sentence structure as “lifeless” and “dull.” They also pointed out that some of the quotes included could not be verified with a Google search. 

Educators should ask more questions before arriving at any definitive answers or policies. We need a year or more to figure out what works and what doesn’t. On this point, I think that Mr. Murphy, wherever he is, would agree with me. I also hope that he is proud of how I am tackling the intersection of technology and pedagogy. Most of all, I hope that he still considers his decision to hire me, then an inexperienced, impressionable 23-year-old, as wise and sound. 
David Cutler

David Cutler is a history and journalism teacher at Brimmer and May School in Chestnut Hill, Massachusetts.