
Shortly after generative artificial intelligence became popular, researchers cautioned that chatbots would create a severe problem: disinformation would become more prevalent, and conspiracy theories would spread ferociously.
Now, scientists wonder if bots might also offer a solution. DebunkBot, an AI chatbot designed to persuade users to stop believing baseless conspiracy theories, made substantial and long-lasting progress in changing people's convictions, according to a study published Thursday in the journal Science. The new findings challenge the commonly held notion that facts and logic cannot dislodge conspiracy theories. DebunkBot, built on the technology that underlies ChatGPT, may offer a practical way to channel accurate information.
According to conventional wisdom, once someone fell down the conspiratorial rabbit hole, no amount of explaining could pull them out.
According to Thomas Costello, a co-author of the study and an assistant professor of psychology, conspiracy theories were thought to satisfy a deep-seated need to explain and control one's environment.
However, Costello and his colleagues wondered if there might be another explanation. What if debunking efforts simply weren't personalized enough? A one-size-fits-all debunking script isn't the best approach, because conspiracy theories differ from person to person and each believer may cite different evidence to support their ideas. The researchers speculated that a chatbot that could counter each person's conspiratorial claims with vast amounts of information might be far more effective.
To test that assumption, they asked over 2,000 people to describe a conspiracy theory they believed in and to rate their belief on a scale from 0 to 100. Then some participants had a brief conversation with the bot.
One participant, for example, believed the 9/11 terrorist attacks were an "inside job" because jet fuel couldn't have burned hot enough to melt the steel beams of the World Trade Center. "It is a common misconception that the steel needed to melt for the buildings to collapse," the bot responded, explaining that steel begins to lose strength and become more pliable at temperatures well below its melting point of around 2,500 degrees Fahrenheit.
After three rounds of conversation, which lasted about eight minutes on average, participants rated their belief again. On average, the ratings dropped by about 20 percent, and about a quarter of participants no longer believed the conspiracy theory.
The researchers are now exploring how to re-create this effect in the real world. They have considered buying ads that appear when someone searches for a common conspiracy theory, or deploying the bot in online communities where these beliefs are shared. (NYT)