A student at Northeastern University demanded a refund of her tuition after learning that her professor had used AI tools such as ChatGPT to prepare course materials, despite a course policy prohibiting the use of such technology. Ella Stapleton, then a senior, was reviewing lecture notes for her organizational behavior class when she noticed a prompt addressed to ChatGPT left in the document. The New York Times reported that the materials contained instructions like “expand on all areas” and showed telltale signs of AI-generated content, including awkward wording, distorted images, and typos characteristic of machine output.

“He’s telling us not to use it, and then he’s using it himself,” Stapleton was quoted as saying in The Times. She complained to the university’s business school about her professor’s undisclosed use of AI, along with other concerns about his teaching, and requested a refund of the course’s tuition, which came to more than $8,000.

Rick Arrowood, the professor, later admitted to using ChatGPT, Perplexity AI, and Gamma to refresh his course materials. “In hindsight, I wish I had looked at it more closely,” Arrowood said. He also acknowledged that the AI-generated material was flawed, and said he had not used those elements in class discussions, which were held in person.

Northeastern denied Stapleton’s refund request after a series of meetings. According to a university spokesperson, the school “embraces the use of artificial intelligence to enhance all aspects of its teaching, research, and operations,” and it enforces policies requiring attribution and accuracy checks when AI-generated content is used.

The episode has sparked a wider debate in higher education, where students are increasingly calling out faculty for using AI tools. While many colleges forbid students from using ChatGPT and similar tools in their coursework, professors are now facing scrutiny for doing the same. Some students argue that they are paying to be taught by people, not by tools they could use themselves for free.

Stapleton’s case is hardly unique. A student at Southern New Hampshire University discovered that her instructor had used ChatGPT to grade essays and generate feedback, which left the student feeling “wronged” and eventually prompted her to transfer to a different university, according to The Times. Paul Shovlin, an English professor at Ohio University, acknowledged students’ concerns but argued that using AI to create presentations or materials is comparable to relying on published teaching aids. He went on to stress the importance of transparency, saying that “it’s the human connections that we build with students… that add value.”