The Infiltration of AI in Education
Educators scramble to adapt to new and rapidly developing technologies, reworking the classroom and prompting a change in formal education for years to come.
First established as a field of research in the 1950s, artificial intelligence (A.I.) has grown exponentially in recent years. This cutting-edge technology “performs tasks typically requiring human intelligence, such as visual perception, speech recognition, decision-making, and natural language processing” (ChatGPT). It goes one step further than previous input/response generators: people no longer have to curate their searches or comb through websites and articles to find information.
Intelligence systems such as Alexa, social media algorithms, Tesla’s autopilot, and on-site chatbots have become integrated into almost every aspect of day-to-day life. Yet the technology’s advancements have seemingly flown under the radar for quite some time.
It wasn’t until the launch of ChatGPT (Chat Generative Pre-trained Transformer) in November 2022 that A.I. finally caught mainstream attention. “We’re always interested in new technologies and how they integrate with education,” states Marc Ebenfield, Director of the Center for Excellence in Teaching and Learning at the University of New England (UNE).
ChatGPT is a cutting-edge language model designed and developed by OpenAI, one of the leading research organizations in the field of artificial intelligence. OpenAI’s earliest GPT models, first released in 2018, were trained on tens of gigabytes of text data; GPT-3, the model family underlying ChatGPT, was trained on a dataset of over 570 gigabytes. This massive amount of training data enables the program to generate text that is not only coherent and grammatically correct but also closer to human-like language than previous models.
Along with ChatGPT, sites such as Caktus ai and Elicit offer free programs to write and analyze any content without worries of plagiarism. Essay AI Lab, for example, assures students that “if your essay will go through a plagiarism checker such as Turnitin, don’t worry. Essayailab rephrases sufficiently so that your essay differs enough from any existing article.”
These companies acknowledge that students are under immense pressure; their grades, degrees, and, ultimately, futures ride on their work output. A.I. technology helps ease this stress by writing essays, answering homework questions, and paraphrasing what should be thoughtfully understood. In many ways, these new programs give students a way to relieve the stress of the present without preparing them for the future.
Sites such as AI Essay Writer ironically advertise that they will “upgrade your writing skills” and that they are “trusted by over 250,000 writers” (AI Essay Writer), lending weight to the worry that new artificial intelligence may undermine the concept of writing, or of being a writer.
Understandably, faculty at colleges and universities nationwide have many concerns about this new technology. “You know, the problem with machine learning is that it’s presumably going to get a lot better really fast, and there’s long been anxiety about new technology, like, ‘what’s that going to do to education?’” says Jonathan DeCoster, Assistant Teaching Professor of History and professor of Science, Technology, and The Modern World at UNE.
According to UNE’s Academic Integrity Policy, specific uses of technology are constrained in an academic setting. Although A.I. is not explicitly mentioned, it falls under the policy’s first description of Academic Dishonesty, the “receiving of unauthorized assistance or information,” as well as under its definition of plagiarism: “the appropriation of records, research, materials, ideas, or the language of other persons or writers and the submission of them as one’s own” (Academic Integrity Policy).
Section H of the Student Handbook also bans some uses of A.I., stating that “the use of materials or devices during academic work, test, quiz, or other assignment which are not authorized” (p. 59) constitutes academic dishonesty. “Substituting for another person, or permitting another person to substitute for oneself, to complete academic work” (p. 59) is likewise prohibited and encompasses the use of certain technologies.
“Education should be, for lack of a better term, painful because you learn how to put a positive spin on it,” says Karl Carrigan, Center Coordinator in the Center for Excellence in Teaching and Learning at UNE. “It’s not just a paper at the end of the day; it’s the pathway of experiences that brings us to where we need to be.”
However, it is vital to note that the educational experience has changed drastically in recent years. Immersion in various forms of remote learning during the Covid pandemic forged a strong relationship between education and technology. Now, at the start of the post-Covid era, that connection remains prominent in the ways students collect and interpret information.
“Being in the classroom for only two days a week caused me to use more online resources for learning. I was definitely more open to using them when I was remote versus when I was in the classroom,” says Elia O’Hara, a sophomore English and political science double major.
Many people question whether this accessibility has made modern-day students too reliant on artificial intelligence. “Yes, technology is very useful in the classroom. But when is it too much? When does it actually inhibit learning as opposed to complement learning?” says Carrigan.
Increasing internet accessibility is forcing educators to ask hard questions. “What’s the point? What is the point of this course? What is the point of education? What is the point of this assignment? If the point of the assignment is easily copied or easily replicable, then maybe I need to reconsider what the point is,” says DeCoster.
Despite mounting concerns, many people feel confident that no technology could unequivocally replace the nuances of human writing. “It’s more skilled than most students at writing coherent sentences and paragraphs, but it’s not skilled at having a lot of thought or meaning in those paragraphs; it’s just recycling words and phrases,” says DeCoster. “The skills I’m teaching are still valuable; they’re not completely replaceable by A.I.”
“Any technology that comes up will probably cause a stir like this. We’ve definitely seen it before; Google was supposed to make things too easy for students, and now we’ve implemented it as a major tool,” says O’Hara.
In fact, some teachers have decided to embrace A.I. as a classroom tool. “There are those who say that education is all about the transfer of knowledge, and technology helps that transfer,” explains DeCoster. These new programs appear to have become a one-stop shop for educational aid of any kind, “so obviously, it’s a tool that has humongous potential,” says Ebenfield.
“I feel like it might be helpful if you are brainstorming ideas or looking at topics. You could almost view it as talking to another peer, bouncing ideas around, and seeing what comes back,” says O’Hara.
Nonetheless, many educators are taking a more measured stance toward new A.I. technologies.
“Students have always cheated. The big thing about something like ChatGPT is that it makes it free and accessible to everyone,” says DeCoster.
With accessibility at the forefront of the issue, it is crucial to consider what Ebenfield calls the “research on cheating, which shows that the amount of cheating hasn’t changed over time. A small percentage of students cheat a small percentage of the time.” In fact, many educators are more concerned about the effects of plagiarism than the act itself; the worry is that students are not genuinely learning and growing.
With the repercussions of ChatGPT still in question, universities nationwide are being forced to ask tough questions about their future. Yet, the conversation always leads back to this: room for growth is essential to a meaningful education, whether A.I. is integrated or not.
Educators must “work in the classroom to create an environment that is supportive and empathetic; one that encourages risk-taking and the comfortableness that comes with being wrong, or learning how to be wrong,” says Carrigan.
*As part of the research for this article, Hedegard used various artificial intelligence programs for assistance. Below is a complete list of the A.I. used to assist in creating this piece and their corresponding roles:
- Otter.ai – Used to transcribe recorded interviews.
- ChatGPT – Wrote one paragraph (can you guess which one? Comment below) and provided minimal background information.
- Grammarly – Corrected all punctuation, spelling, and grammar errors.*