LocalLLaMA @sh.itjust.works
Timely_Jellyfish_2077 @programming.dev · 1y ago

LocalLLaMA setup for $100k

Consider this hypothetical scenario: if you were given $100,000 to build a PC/server to run open-source LLMs like Llama 3 for single-user purposes, what would you build?