How I run LLMs locally
A HN user asked me[0] how I run LLMs locally, with some specific questions; I'm documenting the answers here for everyone.

Before I begin, I would like to credit the thousands, perhaps millions, of unknown artists, coders and writers upon whose work Large Language Models (LLMs) are trained, often without due credit or compensation.

Get Started

The r/LocalLLaMA subreddit[1] and the Ollama blog[2] are great places to get started with running LLMs locally.

Hardware

I have a laptop running Linux with a Core i9 (32 threads) CPU, a 4090 GPU ...
Read more at abishekmuthian.com