
Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone

tl;dr

- Can someone give me step-by-step instructions (ELI5) on how to access the LLMs on my rig from my phone?

Jan seems the easiest, but I've also tried Ollama, LibreChat, etc.

.....

I've taken steps to secure my data and now I'm going the self-hosting route. I don't care to become a savant with the technical aspects of this stuff, but even the basics are hard to grasp! I've been able to install an LLM provider on my rig (Ollama, LibreChat, Jan, all of 'em) and I can successfully get models running on them. BUT what I would LOVE to do is access the LLMs on my rig from my phone while I'm within proximity. I've read that I can do that via WiFi or LAN or something like that, but I have had absolutely no luck. Jan seems the easiest because all you have to do is something with an API key, but I can't even figure that out.

Any help?

43 comments
  • Self hosting IS hard, don't beat yourself up too much over it. After all, you're trying to run services for yourself that are usually provided by companies with thousands of employees.

    A server requires knowledge, maintenance and time; it's okay to feel frustrated sometimes.

  • Just do like me - install Ollama and OpenWebUI, install Termux on Android, and connect through Termux with port forwarding.

    ssh -L 0.0.0.0:3000:ServerIP_OnLAN:3000 user@ServerIP_OnLAN

    And access OpenWebUI at http://127.0.0.1:3000/ on your phone browser. Or SSH forward the Ollama port to use the Ollama Android app. This requires you to be on the same LAN as the server. If you port forward SSH through your router, you can access it remotely through your public IP (if so, I'd recommend only allowing login through certs, or having a rate limiter for SSH login attempts).
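    That hardening bit as a sketch, assuming the server is Linux running OpenSSH (fail2ban or similar would cover the rate-limiting part):

    # /etc/ssh/sshd_config on the server
    PasswordAuthentication no     # key-based login only
    PubkeyAuthentication yes
    PermitRootLogin no
    # then reload it, e.g. sudo systemctl restart sshd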

    The shell command will then be ssh -L 0.0.0.0:3000:ServerIP_OnLAN:3000 user@YourPublicIP
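    Putting the same-LAN version together end to end, a minimal sketch (192.168.1.50 and "user" are placeholders for the server's LAN IP and your SSH login):

    pkg install openssh          # one-time setup in Termux
    ssh -L 0.0.0.0:3000:192.168.1.50:3000 user@192.168.1.50
    # leave that session open, then browse to http://127.0.0.1:3000/ on the phone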

    But what are the chances that you run the LLM on a Linux machine and use an Android phone to connect, like me, and not a Windows machine and an iPhone? You tell me. No specs posted...

    • Oh! Also, I’m using Windows on my PC. And my phone is an iPhone.

      I’m not using Linux yet, but that’s on my to-do list for the future! After I get more comfortable with some more of the basics of self hosting.

      • Oh! Also, I’m using Windows on my PC. And my phone is an iPhone.

        Okay, that's a starting place. So if this is Windows, and if you only care about access on the wireless network, then I suppose that it's probably easiest to just expose the stuff directly to other machines on the wireless network, rather than tunneling through SSH.

        You said that you have ollama running on the Windows PC. I'm not familiar with LibreChat, but it has a Web-based interface? Are you wanting to access that from a web browser on the phone?
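        If it's just the ollama API you want reachable from the phone, the "expose it directly" route might look roughly like this (a sketch; 192.168.1.50 is a placeholder for the PC's LAN IP, and 11434 is ollama's default port). Ollama only listens on 127.0.0.1 by default, so it has to be told to listen on all interfaces first:

        # on the Windows PC (PowerShell), then quit and restart the ollama app
        # (the Windows firewall may also need to allow inbound TCP 11434):
        setx OLLAMA_HOST "0.0.0.0"

        # from a browser on the phone, http://192.168.1.50:11434/ should answer "Ollama is running";
        # from any shell on the same WiFi:
        curl http://192.168.1.50:11434/api/tags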

    • Bet, I’ll try that when I get home tonight. If I don’t have success, can I message you directly?

    • ssh -L 0.0.0.0:3000:YOURPUBLICIP:3000

      If you can SSH to the LLM machine, I'd probably recommend ssh -L 127.0.0.1:11434:127.0.0.1:11434 <remote hostname>. If you don't have a firewall on your portable device, or inadvertently bring one down, you don't want to be punching a tunnel from anything that can talk to your portable device straight to the LLM machine.

      (Using 11434 instead of 3000, as it looks like that's ollama's port.)
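      Once that tunnel is up, anything on the phone that talks to 127.0.0.1:11434 comes out at ollama on the LLM machine; a quick sanity check, as a sketch:

      # with the ssh -L session still open, on the phone:
      curl http://127.0.0.1:11434/api/tags    # should list the models installed on the LLM machine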

      EDIT: OP, it's going to be hard to give a reliable step-by-step, because I have no idea what your network looks like. So, for example, it's possible to have your wireless access point set up so that devices can't talk to each other at all. You might have some kind of firewall on your LLM machine, so that if they can talk to each other from the WAP's standpoint, the firewall will block traffic from your phone; you'd need to punch a hole in that. At least something (sshd for the example here, or ollama itself to the network) needs to be listening on a routable address. As DrDystopia points out, we don't even know what OS the LLM machine is running (Linux?) so giving any kind of step-by-step is going to be hard there.
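      As an example of what punching that hole might look like: if the LLM machine turns out to be Windows with the built-in firewall, and the phone needs to reach ollama's port directly (11434 assumed), something like this from an elevated prompt; the ufw line is the rough Linux equivalent. For the SSH-tunnel route it's sshd's port 22 that needs allowing instead.

      # Windows Defender Firewall: allow inbound TCP 11434
      netsh advfirewall firewall add rule name="Ollama LAN" dir=in action=allow protocol=TCP localport=11434

      # Linux with ufw:
      sudo ufw allow 11434/tcp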

      I have had absolutely no luck.

      Problem is, that doesn't say much. Like, doesn't say what you've seen.

      Do you know what the LAN IP address of your LLM machine is? Can you ping that IP address from Termux on your phone when both are on the same WiFi network ($ ping <ip-address>)? What OS is the LLM machine? If Linux, do you have sshd installed? It sounds like you do have ollama on it, and that it's working if you use it from the LLM machine itself? When you said that it didn't work, what did you try, and what errors or behavior did you see?
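      Those checks as commands, roughly (192.168.1.50 stands in for whatever the LLM machine's LAN IP turns out to be):

      # on the LLM machine: find its LAN IP
      ipconfig                         # Windows
      ip addr                          # Linux

      # on the LLM machine: confirm ollama is actually listening
      netstat -an | findstr 11434      # Windows
      ss -ltn | grep 11434             # Linux

      # from the phone, on the same WiFi:
      ping 192.168.1.50
      curl http://192.168.1.50:11434/  # only answers if ollama is listening on the LAN address, not just 127.0.0.1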

      • 3000 is the OpenWebUI port; I never got it to work with either 127.0.0.1 or localhost, only 0.0.0.0. Ollama's port 11434 on 127.x worked fine though.

        you don’t want to be punching a tunnel from whatever can talk to your portable device to the LLM machine.

        Fair point.
