I remember the first day one of my friends told me about ChatGPT. My farm-kid brain thought Artificial Intelligence meant something a little freakier, so I was amazed to hear about this text bar where I could type anything in and get what I wanted instantly. We screwed around with it so much, asking it to write movie scripts and put our friends in reality TV shows. But then I remember even more clearly the first time one of my friends got caught using it on an English exam. That brought up a serious question.
How do we manage such a useful yet critical-thinking-killing tool?
All of our syllabi at Washington and Lee have AI sections. What surprised me most were the stark differences in AI policies across departments. In every one of my classes, the use of AI differs, ranging from being able to use it on a midterm to absolute zero AI tolerance. How does each department view this valuable tool?
In my English class, there’s a blanket no-AI policy with some rather harsh and extremely specific scenarios:
“You may not use generative AI tools—including Large Language Models like ChatGPT, Gemini, and Copilot—for any assignments in this course. This includes copying text generated by AI that paraphrases or summarizes sources; using AI to adjust the tone or style of your writing; using AI to summarize or analyze assigned readings; using AI to craft arguments or responses; using AI to brainstorm ideas or topics; or using AI to generate, revise, polish or correct any writing you submit as your own. A breach of this policy will be treated as plagiarism.”
I could analyze this paragraph in my English class. The short, choppy clauses create a sense of anxiety and worry. Anyway, I think this is fair for an English class. In a field built on originality, in-depth analysis and critical thinking about others’ work, AI shouldn’t be permitted. I understand the argument for using it for summarization, notes or brainstorming — it can make the prep work for writing easier. But even that takes away from the originality and critical thinking English calls for. Writing is about emotion and your own ideas. If we all used AI for these assignments, everything would look the same. Where’s the fun in that? Where’s the fun for your English professor, who wants to grade some original, cool stuff?
This brings up a broader question about AI in literature. It can make editing more efficient, writing easier and idea generation quicker. But will we abstain from using it in creative fields? Or will we get lazy and let it create our entertainment? AI is about efficiency, but does it replace or even just complement creativity? I think not.
In my Latin class, it’s another no-AI policy, but this time with a twist:
“The unattributed use of generative AI tools on any assignment in this course may constitute an Honor Violation. If you believe that the attributed use of generative AI may aid you or your classmates in fulfilling the course objectives, please consult with me before proceeding.”
Again, I think this is a fair policy. Foreign and classical languages have been fighting off online helpers for longer than any other subject has (Google Translate ain’t too shabby), and any tool that translates for you defeats the purpose of learning a language. Learning a new language is supposed to train your brain to recognize and communicate instantly, not to whip out your phone and translate on the spot. But we can, and it works rather well. So does this make learning a language more and more antiquated? Should we even still be teaching it in school? Maybe it’s time to teach proper AI prompting instead of how to translate Latin paragraphs.
In my business class, we see a complete reversal of these prior ideas on AI. We use AI in full for every assignment. From writing memos to drafting emails, we learn how to prompt ChatGPT more than we learn how to write ourselves. For me, that makes it fun and interesting. But overall, I think this is another solid AI policy. Business is all about efficiency, profit and success — by any means necessary. Why wouldn’t you use a tool that cuts costs? That simply wouldn’t be good business. This raises yet another question: Does AI level the playing field for businesses? Does originality lead to higher profit, or does efficiency? What I don’t know — but am curious about — is which model will win: heavy AI usage and investment, or better-trained and critical-thinking employees who use AI only for detailed work.
What you can probably tell from all of this is that nobody, including me, knows what the future of academic AI holds. I can’t predict what it will mean for teaching languages, the businesses of tomorrow or literature. But I do know that W&L is managing it well in each department — for now. It would be cutting-edge if we leaned into AI, teaching students how to use it and showing them how it fits into the real world. If we do that, our already well-prepared students are going to be light-years ahead. It’s hard to get that balance right, though, and I’m not sure a classical liberal arts school will jump that far into the future. Whatever — I’m just going to avoid using it where it says “no” so I don’t get expelled. Or maybe I won’t.
