News Score: Score the News, Sort the News, Rewrite the Headlines

Asking chatbots for short answers can increase hallucinations, study finds | TechCrunch

Telling an AI chatbot to be concise, it turns out, could make it hallucinate more than it otherwise would. That’s according to a new study from Giskard, a Paris-based AI testing company developing a holistic benchmark for AI models. In a blog post detailing the findings, Giskard’s researchers say that prompting for shorter answers, particularly to questions on ambiguous topics, can negatively affect an AI model’s factuality. “Our data shows that simple changes to system instructions dr...

Read more at techcrunch.com
