Uses for local AI?
I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.
I'm thinking about removing the VM as I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?
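One low-effort reason to keep it: Ollama exposes an HTTP API on the server, so other apps and scripts can use it without the WebUI. A minimal sketch, assuming the default port 11434 and the Mistral model mentioned later in the thread (the prompt text is just a placeholder):

```python
# Sketch: calling a local Ollama server's /api/generate endpoint.
# Assumes Ollama's default port 11434; "mistral" must already be pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for a non-streaming generate request."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    """Send one prompt and return the model's full response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    try:
        print(generate("mistral", "One sentence: why run an LLM locally?"))
    except OSError:
        print("Ollama not reachable on localhost:11434")
```

Anything that can POST JSON can integrate the same way, which is how most third-party apps hook into Ollama.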
Ollama without a GPU is pretty useless unless you're using it with Apple silicon. I'd just get rid of it until you get a GPU.
Works fine on an 11th Gen i5. Not fast, but not slow.
I have never tested it on Apple silicon, but it works fine on my laptop.
What are your laptop specs?
Intel 12th gen i5.
CPU is only one factor regarding specs, and a small one at that. What kind of t/s performance are you getting with a standard 13B model?
I don't have enough RAM to run a 13B. I just stick to Mistral 7B and it works fine.
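For anyone wanting to answer the t/s question precisely: Ollama's non-streaming `/api/generate` response includes `eval_count` (tokens generated) and `eval_duration` (in nanoseconds), so tokens per second falls out directly. A small sketch (the sample numbers are made up for illustration):

```python
# Compute tokens/second from the eval_count and eval_duration fields
# that Ollama returns in a completed /api/generate response.
def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """eval_duration is reported in nanoseconds; convert to seconds."""
    return eval_count / (eval_duration_ns / 1e9)

# Hypothetical example: 120 tokens generated over 15 s of eval time.
print(tokens_per_second(120, 15_000_000_000))  # → 8.0
```

`ollama run <model> --verbose` prints the same eval rate interactively, which is an easier one-off check.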