AI in the Classroom: Helpful Study Tool or Shortcut to Nowhere? Teachers and Students See It Differently


(Image by Gerd Altmann from Pixabay)

By Anushka Devanathan

The number of U.S. teens using AI to complete homework has doubled in just one year, according to a new Pew Research Center study, a shift that is raising alarms for teachers in local classrooms.

As artificial intelligence tools like ChatGPT become more accessible to students, educators are growing increasingly concerned about their role in schoolwork — and whether they threaten students’ ability to think critically.

According to Pew, usage of AI for schoolwork jumped from 13% in 2023 to nearly 26% in 2024. Awareness of tools like ChatGPT also increased over the same period, rising from 67% to nearly 79%. That growing familiarity appears to have led more teens to consider it acceptable to use AI for schoolwork.

Commentary: ‘Amazing and Scary All at Once’: How Teens Feel About A.I.

ChatGPT and other AI tools such as Google’s Gemini, Meta’s Llama and Microsoft Copilot are built on large language models, systems trained to predict and generate plausible language. LLMs learn from vast collections of text, conversations and other data gathered from the internet and elsewhere. As users continue to chat with it, ChatGPT tailors its responses to their input and feedback. These tools can also analyze data and images, browse the web, and generate images, giving students many ways to use them.

While AI tools have found a foothold among teens, the reasons for their use vary. Arshia Mittal, a sophomore at Granada High School, said she turns to AI specifically when she is struggling with an assignment.

“Using AI for schoolwork is cheating if you have AI do the entire assignment for you,” Mittal said. “If you just need clarification on something, then it might be fine to ask AI.”

Other students prefer to use AI as a form of active practice. Given a prompt, AI tools can generate questions or explain concepts, then adapt their responses based on how the user reacts. Because they can also search the web, the tools can surface additional resources for students to practice with.

“AI has helped me with challenging subjects by giving me practice problems or questions as well as helpful information on the topic,” said Taryn Izuhara, a sophomore at Dougherty Valley High School. “I use AI to get prompts for AP World or multiple choice questions so I can study for an upcoming test.”

Educators, however, remain wary. At DVHS, teachers worry that students may become dependent on a tool that is not entirely reliable, one of AI’s leading drawbacks. LLMs are trained on data pulled from the internet, which is filled with so-called facts that are incorrect or misleading, and chatbots simply repeat those claims back to users. The result can be misinformation, plagiarism and fabrication.

“Students are going to lose their ability to think critically,” said Marie Ver Haar, who teaches ninth and 10th grade English.

“The problem is that students who use AI are typically doing it because they want to take a shortcut,” said Ver Haar. “They’re not doing it because they want to see if AI could have said better what they wanted to say in the first place.”

A common issue with AI chatbots is a behavior called “hallucination.” According to The New York Times, one study found that chatbots make up information at least 3% of the time, and as much as 27% of the time. This is largely due to unrepresentative or missing data: unable to find a source, an LLM fills the gap with whatever data it has, often distorting the facts.

At the same time, some school districts are turning to AI as an aid to student learning. This fall, the San Ramon Valley Unified School District will roll out Google’s AI tool, Gemini, on school-issued computers; for now, such sites remain blocked on school Wi-Fi and guest devices. According to the SRVUSD website, the goal is for educators to integrate AI into their teaching practices and work more efficiently.

Still, not all educators are optimistic about the rollout.

“[I’m] absolutely against it,” said Ver Haar. “My understanding is that when they do make it available to students, I can control their access to it. However, I think I can only control their access to it through my Google Classroom. So I can’t really control their access.”

Teachers’ lack of control over Gemini mirrors their inability to see how students actually complete the coursework they submit through Google Classroom. As a result, the question of whether AI was used often goes unanswered by the time teachers sit down to grade.

“It [AI] dilutes the integrity of the course, and if I can’t validate if the student’s grade is true and accurate, then it invalidates the grade I give them,” Ver Haar said.

While AI tools have the potential to enhance learning, some teachers remain more focused on the drawbacks than on exploring how chatbots could be productively integrated into the classroom.

“Benefits apply to the people who can use it with scrutiny and understand where those inaccuracies fall,” said Ver Haar. “When we’re talking about students, students don’t necessarily know that the information they’re being given is wrong.”

“Not only do [students] not get points for what they’ve submitted,” she added, “they haven’t learned anything.”
