To Make Language Models Work Better, Researchers Sidestep Language
Language isn’t always necessary. While it certainly helps in getting some ideas across, neuroscientists have argued that many forms of human thought and reasoning don’t require the medium of words and grammar. Sometimes, the argument goes, having to turn ideas into language actually slows down the thought process.
Now there’s intriguing evidence that certain artificial intelligence systems could also benefit from “thinking” independently of language.
When large language models (LLMs) pro...
Read more at quantamagazine.org