Local LLM inference - Amir Zohrenejad - Medium
Tremendous progress, but not ready for production

I stumbled into local inference during a side quest. The main mission was to build a text-to-SQL product, but curiosity hijacked the roadmap: could I cram an entire GenBI stack into the browser?

The prototype never shipped, but I did fall down the local inference rabbit hole. Though “AI inference” is not a listed feature on my laptop’s spec sheet, through the magic of open source software it can now run powerful LLMs in its browser tabs for free. It’s i...