Hallucinations in code are the least dangerous form of LLM mistakes

2nd March 2025

A surprisingly common complaint I see from developers who have tried using LLMs for code is that they encountered a hallucination—usually the LLM inventing a method or even a full software library that doesn’t exist—and it crashed their confidence in LLMs as a tool for writing code. How could anyone productively use these things if they invent methods that don’t exist? Hallucinations in code are the least harmful hallucinations you can encounter from a model. The real risk from us...
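One reason code hallucinations are comparatively harmless is that they tend to fail loudly the first time the code runs. A minimal sketch of that idea, assuming an invented json.parse() call as a stand-in for the kind of method an LLM might hallucinate (the real standard-library function is json.loads()):

    import json

    data = '{"title": "Hallucinations in code"}'

    try:
        # json.parse() does not exist in Python's standard library --
        # a hypothetical example of a hallucinated method.
        parsed = json.parse(data)
    except AttributeError as exc:
        # The mistake surfaces immediately at runtime, before it can
        # quietly mislead anyone.
        print(f"Hallucinated method caught: {exc}")
        parsed = json.loads(data)  # the correct call

    print(parsed["title"])

Running the snippet raises an AttributeError on the invented call, which is exactly why this class of mistake is easy to catch compared with code that runs but does the wrong thing.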

Read more at simonwillison.net
