A student at Northeastern University demanded a tuition refund after discovering that her professor had used AI tools like ChatGPT to produce course materials, despite a course policy prohibiting the use of such technology. Ella Stapleton, then a senior, was reviewing lecture notes for her organizational behavior class when she noticed a prompt addressed to ChatGPT embedded in the document. The New York Times reported that the materials contained instructions like “expand on all areas” and showed telltale signs of AI generation, including inconsistent terminology, distorted images, and typos characteristic of machine output. “He’s telling us not to use it, and then he’s using it himself,” Stapleton was quoted as saying in The Times. She filed a complaint with the university’s business school over her professor’s undisclosed use of AI, along with other concerns about his teaching, and requested a refund of the course’s tuition, more than $8,000. The professor, Rick Arrowood, later admitted to using ChatGPT, Perplexity AI, and Gamma to refresh his older materials. “In hindsight, I wish I had looked at it more closely,” Arrowood said. He also acknowledged that the AI-generated material was flawed, though he said he had not relied on those materials in class discussions, which were held in person. After a series of meetings, Northeastern denied Stapleton’s request for a refund. According to a university spokesperson, the school “embraces the use of artificial intelligence to enhance all aspects of its teaching, research, and operations,” and it enforces policies requiring attribution and accuracy checks when AI-generated content is used. The episode has sparked a wider debate in higher education, where students are increasingly criticizing faculty for using AI tools.
While many colleges forbid students from using ChatGPT and similar tools in their courses, professors are now facing scrutiny for doing the same. Some students argue that they are paying to be taught by people, not by tools they could use themselves for free. Stapleton’s case is hardly unique. A student at Southern New Hampshire University discovered that her professor had used ChatGPT to grade essays and generate feedback, leaving her feeling “wronged” and ultimately prompting her to transfer to another university, according to The Times. Paul Shovlin, an English professor at Ohio University, acknowledged students’ concerns but argued that using AI to create presentations or course materials is comparable to relying on published teaching aids. He went on to stress the importance of transparency, saying that “it’s the human interactions that we build with students… that add value.”