
ChatGPT at SAIS
By: Rowan Humphries
Edited By: Alexandra Huggins
One year on from ChatGPT’s public release, students and professors continue to grapple with AI use in the classroom.
As we stand at the threshold of a new era in education, the once-distant concept of Artificial Intelligence has become an integral part of our classrooms. The traditional chalk-and-board model is rapidly giving way to a more dynamic, technology-infused learning environment.
Would it surprise you to learn that the paragraph you just read was not written by me, but by AI-powered chatbot ChatGPT in response to the prompt: “write an introductory paragraph to a student newspaper article about AI use in the classroom”? Probably not. A year on from the model’s explosive public debut, most students have tested out the seemingly incredible capabilities of ChatGPT and similar large language models to augment their work inside and outside the classroom. Some SAIS professors have even required their use on assignments. While many find AI tools helpful for brainstorming ideas for assignments or finding resources, overreliance raises concerns about critical thinking skills and academic integrity.
As of now, Johns Hopkins University has not articulated a specific university-wide policy to address generative AI technologies, with administrators deferring to professors to institute their own policies and provide guidance on how the technology fits into their classes and assignments.
“For now, generative AI (including ChatGPT) falls under the same policies as all Honor Code Violations and overall academic honesty,” said Julie Micek, the Assistant Dean for Academic Affairs at SAIS. “Right now there is no university-wide policy specific to generative AI. We are in the process of exploring further to see if we (SAIS) will add its own language specific to this.”
A versatile tool
According to a poll I posted in the SAIS student body Signal group chat, 59% of respondents reported using AI tools “occasionally” to augment their schoolwork, 35% said they did not use them at all, and 6% said they used them “very frequently.” Several students who responded to the poll agreed to discuss the specifics of their AI use further with me, but many felt uncomfortable being quoted by name for this article and asked to remain anonymous.
One such student, who reported they use AI tools “occasionally,” said they use ChatGPT as a starting point for research and to automate some of the more repetitive aspects of academic writing, such as identifying relevant scholars, models and theories. “The fact that I don’t have to write the summary of the academic landscape myself saves me so much time that I can instead use for all the more interesting parts, the parts that AI definitely can’t do, which is doing the actual research,” they said.
Another student who wished to remain anonymous said they use AI tools to help them brainstorm ideas for assignments, but treat ChatGPT “as if it was another student in the class that was very well informed yet could be confidently wrong.”
Naomi Grant, a second-year MAIR student who said she uses AI tools “frequently” to augment her schoolwork, told me she primarily uses a paid program called UPDF to summarize her readings. “Not only does it summarize PDFs, but you can also ask questions about the text, like ‘What are the most important points you didn’t already mention?’” she explained.
While many students use AI tools to narrow down ideas for essays or focus their research, others have turned to them for quantitative work. “I heavily rely on GPT and plug-ins to supplement my stats and econ readings,” said Ian Ching, a first-year MAIR student. “I find it’s usually highly accurate in mathematical outputs, and can pretty easily convert a written question to a math one,” he explained. “I liken the AI to a math tutor.”
Some students have even been required to grapple with these tools in the classroom. Robert Sumner, a DIA student, described to me how his Foundations of Cybersecurity class, taught by Professor Thomas Rid, is exploring how large language models like ChatGPT and Claude can “help bridge technical gaps” and “provide foundational technical knowledge that students will need in the real world.”
Potential risks
During a September 19 live virtual briefing hosted by Johns Hopkins University on AI’s effects on education, Professor Rid emphasized that “ChatGPT is a superpower…in the classroom, and like power in general, it can either be used for good or for bad.”
When asked “Do AI chatbots harm our ability to think critically?” ChatGPT told me (somewhat defensively) that the matter was “a subject of debate and ongoing research….AI chatbots can both support and potentially hinder critical thinking, and their effect largely depends on how they are implemented and integrated into educational or problem-solving settings.”
“I think that it’s a new enough technology that I’m not sure yet how much it’s a constructive tool that’s helping students learn – I see ways in which it is – and how much it’s a shortcut that’s getting in the way of skill-building and learning,” said Professor Adam Szubin, when asked how he is approaching AI in the classroom.
In regard to her classroom policy on AI tools, Professor Monica Lopez-Gonzalez, who teaches a SAIS course on the science, ethics and politics of AI, said she sees the tool as something that should be questioned, dissected and challenged. “It’s really about how do we understand it better, and you’re not going to understand it better if you don’t utilize it,” she explained.
Professor Lopez-Gonzalez also noted that a heavy reliance on these tools may leave students with only a superficial understanding of topics discussed in the classroom. “It [AI overuse] is having an effect, and you can really see who understands when you push further.”
As students and professors alike continue to experiment with new applications for AI-powered programs, these tools’ long-term effects on higher education remain to be seen. As Professor Szubin put it, “to me it’s something that I want to talk to my colleagues about, I want to talk to my students about, and my best guidance for this semester has been to follow your ethical compass when it comes to how and if you use the tool.”