An LLM Query Understanding Service
We keep cheating at search with LLMs. Indeed, I'm teaching a whole course on this in July.
With an LLM we can implement in days what previously took months. We can take apart a query like "brown leather sofa" into the dimensions of intent that matter: "color: brown", "material: leather", "category: couches", and so on. With this power, all search becomes structured search.
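A minimal sketch of what that decomposition could look like: prompt the model to emit JSON, then validate its reply into a typed structure. The field names (color, material, category) and the prompt wording are illustrative assumptions, not a fixed schema.

```python
import json
from dataclasses import dataclass
from typing import Optional

# Hypothetical prompt for a query-understanding call; the three
# fields are examples, not an exhaustive taxonomy.
PROMPT_TEMPLATE = """Extract structured search filters from this query.
Respond only with JSON containing keys: color, material, category.
Query: {query}"""

@dataclass
class ParsedQuery:
    color: Optional[str]
    material: Optional[str]
    category: Optional[str]

def parse_llm_reply(reply: str) -> ParsedQuery:
    """Validate the model's JSON reply into a typed structure,
    tolerating missing keys rather than crashing."""
    data = json.loads(reply)
    return ParsedQuery(
        color=data.get("color"),
        material=data.get("material"),
        category=data.get("category"),
    )

# What a model reply for "brown leather sofa" might look like:
reply = '{"color": "brown", "material": "leather", "category": "couches"}'
parsed = parse_llm_reply(reply)
```

The parsed fields can then drive filters in a regular structured search backend.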
Even better, we can do all of this without calling out to OpenAI/Gemini/…. We can use simple LLMs running in our own infrastructure, making it faster ...
Read more at softwaredoug.com