Sakana AI’s TreeQuest: Deploy multi-model teams that outperform individual LLMs by 30%
July 3, 2025 3:00 PM
Japanese AI lab Sakana AI has introduced a new technique that allows multiple large language models (LLMs) to cooperate on a single task, effectively creating a “dream team” of AI agents. The method, called Multi-LLM AB-MCTS, enables models to perform trial-and-error and combine the...
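The full article (and Sakana AI's open-source TreeQuest framework) describes the method in detail; what follows is only a minimal, self-contained sketch of the general idea of a multi-model tree search, not TreeQuest's actual API. The model stubs, the uniform choice among models, and the fixed 30% "go wider" probability below are placeholder assumptions for illustration; the published AB-MCTS method makes the branch-vs-refine and model-selection decisions adaptively from observed scores.

```python
import math
import random

# Hypothetical stand-ins for real model calls; in practice each would send the
# prompt to a different LLM API and return (answer, quality_score).
def call_model_a(prompt): return f"A:{prompt}", random.random()
def call_model_b(prompt): return f"B:{prompt}", random.random()
def call_model_c(prompt): return f"C:{prompt}", random.random()

MODELS = {"model_a": call_model_a, "model_b": call_model_b, "model_c": call_model_c}

class Node:
    def __init__(self, answer=None, score=0.0, parent=None):
        self.answer, self.score, self.parent = answer, score, parent
        self.children, self.visits, self.value = [], 0, 0.0

def ucb(node, parent_visits, c=1.4):
    # Standard UCB1: trade off exploiting high-scoring branches vs. exploring new ones.
    if node.visits == 0:
        return float("inf")
    return node.value / node.visits + c * math.sqrt(math.log(parent_visits) / node.visits)

def search(task, budget=30):
    root = Node(answer=task)
    best = root
    for _ in range(budget):
        # Selection: walk down promising branches ("go deeper"), or stop early
        # to add a sibling at the current level ("go wider").
        node = root
        while node.children and random.random() > 0.3:  # fixed probability; real method adapts this
            node = max(node.children, key=lambda n: ucb(n, node.visits + 1))
        # Expansion: pick one of the models to refine the current answer.
        name, model = random.choice(list(MODELS.items()))
        answer, score = model(f"Refine: {node.answer}")
        child = Node(answer=answer, score=score, parent=node)
        node.children.append(child)
        # Backpropagation: update visit counts and accumulated value up to the root.
        cur = child
        while cur:
            cur.visits += 1
            cur.value += score
            cur = cur.parent
        if score > best.score:
            best = child
    return best.answer, best.score

if __name__ == "__main__":
    answer, score = search("Write a sorting routine and test it")
    print(answer, score)
```

The sketch captures the trial-and-error structure described above: each iteration either deepens a promising line of work or starts a fresh one, and any of the cooperating models can contribute the next attempt.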
Read more at venturebeat.com