llama llama @lemmy.dbzer0.com
Posts 15
Comments 65
How to run LLaMA (and other LLMs) on Android.
  • I see. I don't think there are many solutions on that front for Android. For PC there are a few, such as LM Studio.

  • How to run LLaMA (and other LLMs) on Android.
  • Thanks for your comment. That for sure is something to look out for. It is really important to know what you're running and what possible limitations there could be. Not what the original comment said, though.

  • How to run LLaMA (and other LLMs) on Android.
  • This is all very nuanced and there isn't a clear-cut answer. It really depends on what you're running, for how long you're running it, your device specs, etc. The LLMs I mentioned in the post ran just fine and did not cause any overheating when not used for extended periods of time. You absolutely can run a SMALL LLM and not fry your processor if you don't overdo it. Even then, I find it extremely unlikely that you would cause permanent damage to your hardware components.

    Of course that is something to be mindful of, but that's not what the person in the original comment said. It does run, but you need to be aware of the limitations and potential consequences. That goes without saying, though.

    Just don't overdo it. Or do, but the worst thing that will happen is your phone getting hella hot and shutting down.

  • How to run LLaMA (and other LLMs) on Android.
  • For me the biggest benefits are:

    • Your queries don't ever leave your computer
    • You don't have to trust a third party with your data
    • You know exactly what you're running
    • You can tweak most models to your liking
    • You can upload sensitive information to it and not worry about it
    • It works entirely offline
    • You can run several models
  • Help people trying to circumvent censorship by running a Snowflake proxy!
  • That's great news! I'd love for it to be added to a wiki. Just make sure that whatever version of this post is added to the wiki is the most updated one.

  • Help people trying to circumvent censorship by running a Snowflake proxy!
  • No. That has to do with how the Tor network works. The bridge forwards the connection to a non-exit relay; you do not communicate with an exit relay whatsoever. The middle relay does, but the exit relay doesn't know who you are and you don't know who the exit relay is.

  • Help people trying to circumvent censorship by running a Snowflake proxy!
  • I am not entirely sure, to be completely honest. In my experience it is very little, but it varies. It really depends on how many people connect, for how long they connect, etc. If you have limited upload speeds, it might not be a great idea to run it in your browser or on your phone. Maybe try running it directly on your computer using the `-capacity` flag?

    I haven't been able to find any specific numbers either, but I did find a post on the Tor Forum, dated April 2023, from a user complaining about high bandwidth usage. This is not the norm in my experience, though.
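As a back-of-envelope illustration of why usage varies so much (every number below is an assumption for the sake of the example, not a measurement), monthly proxy bandwidth can be estimated like this:

```python
def monthly_proxy_data_gb(clients_per_day, avg_session_min, avg_kb_per_s):
    """Rough estimate of a Snowflake proxy's monthly data usage.

    All inputs are guesses: real usage depends entirely on how many
    people connect, for how long, and at what throughput.
    """
    session_seconds_per_day = clients_per_day * avg_session_min * 60
    kb_per_day = session_seconds_per_day * avg_kb_per_s
    return kb_per_day * 30 / 1_000_000  # KB/day -> GB/month

# e.g. 10 clients a day, 10-minute sessions, 5 KB/s on average:
print(monthly_proxy_data_gb(10, 10, 5))  # ~0.9 GB/month
```

Under these made-up assumptions the result lands in the "under 1 GB per month" range most operators report, but a proxy that happens to serve many long sessions could easily use several times that.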

  • How to run LLaMA (and other LLMs) on Android.
  • There are a few. There's Private AI. It is free (as in beer) but it's not libre (or open source). The app is a bit sketchy too, so I would still recommend doing as the tutorial says.

    Out of curiosity, why do you not want to use a terminal for that?

  • How to run LLaMA (and other LLMs) on Android.
  • I don't know that one. Is it FOSS?

  • Help people trying to circumvent censorship by running a Snowflake proxy!
  • You are completely right. That was worded poorly and a few users have thankfully pointed that out. The answer, for most people, is yes. But that depends entirely on your threat model.

    The traffic to your Snowflake proxy isn't necessarily from people in 'adversarial countries'. A Snowflake proxy is a type of bridge, so just about anyone can use it. You can use a Snowflake bridge yourself, if you want. However, it is strongly encouraged to reserve bridges (including Snowflake) for people who need them.

    So, for most people, it is generally safe to run Snowflake proxies. Theoretically, your ISP will be able to see that connections are being made, but, to them, it will look like you're calling someone on, say, Zoom, since Snowflake uses WebRTC technology. They can't see the data, though, since everything is encrypted (check the Snowflake docs and Tor Browser's for further reference). You probably won't get in any trouble for that.

    Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting in trouble with law enforcement, but none of them have been arrested or prosecuted so far.

    If you know of any, let me know.

  • Help people trying to circumvent censorship by running a Snowflake proxy!
  • I have not used AI to write the post. I used Claude to refine it because English is not my first language. If there are any errors, that is my bad. Please point them out as you did so I can fix them.

    > This has several errors including the fact that running the proxy exposes your IP address.

    Thank you for pointing that out. That was worded pretty badly. I corrected it in the post.

    For further clarification:

    The person who is connecting to your Snowflake bridge connects to it over a P2P-like connection. So that person does know your IP address, and your ISP can also see the IP address of the person connecting to your bridge.

    However, to both of your ISPs, it will look like you are both using some kind of video conferencing software, such as Zoom, since Snowflake uses WebRTC technology. This makes the traffic inconspicuous and obscures from both ISPs what's actually going on.

    To most people, that is not a concern. But, ultimately, it comes down to your threat model. Historically, there haven't been any cases of people running bridges or entry and middle relays and getting in trouble with law enforcement.

    So, will you get in any trouble for running a Snowflake bridge? The answer is quite probably no.

    For clarification, you're not acting as an exit node if you're running a Snowflake proxy. Please check Tor's documentation and Snowflake's documentation.


  • How to run LLaMA (and other LLMs) on Android.
  • Not true. If you try to load a model that is beyond your phone's hardware capabilities, it simply won't run. Stop spreading FUD.
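To put "beyond your phone's hardware capabilities" into rough numbers, a quantized model's memory footprint can be estimated from its parameter count. This is only a rule of thumb; the 4-bit default and the overhead factor are assumptions, not benchmarks:

```python
def approx_model_ram_gb(params_billions, bits_per_weight=4, overhead=1.2):
    """Back-of-envelope RAM estimate for a quantized LLM.

    bits_per_weight=4 assumes a typical 4-bit quantization; the
    overhead factor is a guess covering the KV cache and runtime buffers.
    """
    weight_gb = params_billions * bits_per_weight / 8  # GB just for the weights
    return weight_gb * overhead

print(approx_model_ram_gb(1))  # llama3.2:1b -> ~0.6 GB
print(approx_model_ram_gb(3))  # a 3B model  -> ~1.8 GB
```

This lines up with the experience in the post: a 1B model fits comfortably on a 4 GB phone, while 3B-class models want something closer to 8 GB once the OS and other apps take their share.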

  • How to run LLaMA (and other LLMs) on Android.

    cross-posted from: https://lemmy.dbzer0.com/post/36841328

    > Hello, everyone! I wanted to share my experience of successfully running LLaMA on an Android device. The model that performed the best for me was llama3.2:1b on a mid-range phone with around 8 GB of RAM. I was also able to get it up and running on a lower-end phone with 4 GB of RAM. I also tested several other models that worked quite well, including qwen2.5:0.5b, qwen2.5:1.5b, qwen2.5:3b, smallthinker, tinyllama, deepseek-r1:1.5b, and gemma2:2b. I hope this helps anyone looking to experiment with these models on mobile devices!
    >
    > ---
    >
    > ### Step 1: Install Termux
    > 1. Download and install Termux from the Google Play Store or F-Droid.
    >
    > ---
    >
    > ### Step 2: Set Up proot-distro and Install Debian
    > 1. Open Termux and update the package list:
    >    ```bash
    >    pkg update && pkg upgrade
    >    ```
    > 2. Install proot-distro:
    >    ```bash
    >    pkg install proot-distro
    >    ```
    > 3. Install Debian using proot-distro:
    >    ```bash
    >    proot-distro install debian
    >    ```
    > 4. Log in to the Debian environment:
    >    ```bash
    >    proot-distro login debian
    >    ```
    >    You will need to log in every time you want to run Ollama. Repeat this step and all the steps below every time you want to run a model (excluding step 3 and the first half of step 4).
    >
    > ---
    >
    > ### Step 3: Install Dependencies
    > 1. Update the package list in Debian:
    >    ```bash
    >    apt update && apt upgrade
    >    ```
    > 2. Install curl:
    >    ```bash
    >    apt install curl
    >    ```
    >
    > ---
    >
    > ### Step 4: Install Ollama
    > 1. Run the following command to download and install Ollama:
    >    ```bash
    >    curl -fsSL https://ollama.com/install.sh | sh
    >    ```
    > 2. Start the Ollama server:
    >    ```bash
    >    ollama serve &
    >    ```
    >    After you run this command, press Ctrl + C; the server will continue running in the background.
    >
    > ---
    >
    > ### Step 5: Download and Run the llama3.2:1b Model
    > 1. Use the following command to download and run the llama3.2:1b model:
    >    ```bash
    >    ollama run llama3.2:1b
    >    ```
    >    This fetches and runs the lightweight 1-billion-parameter version of the Llama 3.2 model.
    >
    > ---
    >
    > Running LLaMA and other similar models on Android devices is definitely achievable, even with mid-range hardware. Performance varies depending on the model size and your device's specifications, but with some experimentation you can find a setup that works well for your needs. I'll keep this post updated if there are any new developments or additional tips that could help improve the experience. If you have any questions or suggestions, feel free to share them below!
    >
    > – llama
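Once `ollama serve` is running inside the Debian environment, you can also query it programmatically over its local HTTP API instead of the interactive CLI. A minimal sketch (assumes the server is listening on Ollama's default `localhost:11434`; the actual request is left commented out since it needs a live server):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model, prompt):
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model, prompt):
    """Send a prompt to the local Ollama server and return the response text."""
    data = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# With the server running:
#   print(ask("llama3.2:1b", "Say hello in one word."))
```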


    Help people trying to circumvent censorship by running a Snowflake proxy!

    cross-posted from: https://lemmy.dbzer0.com/post/36880616

    > # Help Combat Internet Censorship by Running a Snowflake Proxy (Browser or Android)
    >
    > Internet censorship remains a critical threat to free expression and access to information worldwide. In regions like Iran, Russia, and Belarus, journalists, activists, and ordinary citizens face severe restrictions when trying to communicate or access uncensored news. You can support their efforts by operating a Snowflake proxy: a simple, low-impact way to contribute to a freer internet. No technical expertise is required. Here's how it works:
    >
    > ---
    >
    > ## What Is Snowflake?
    > Snowflake is a privacy tool integrated with the Tor network. By running a Snowflake proxy, you temporarily route internet traffic for users in censored regions, allowing them to bypass government or institutional blocks. Unlike traditional Tor relays, Snowflake requires minimal bandwidth, no configuration, and no ongoing maintenance. Your device acts as a temporary bridge, not a permanent node, ensuring both safety and ease of use.
    >
    > ---
    >
    > ### Is This Safe for Me?
    >
    > Short answer: yes.
    >
    > Long answer: probably. Here is why:
    >
    > - Your IP address is not exposed to the websites users access, so you don't have to worry about what they are doing. You are not an exit node.
    > - No activity logs. Snowflake cannot monitor or record what users do through your connection. The only stored information is how many people have connected to your bridge. Check the docs for further info on this.
    > - Low resource usage. The data consumed is comparable to background app activity, far less than streaming video or music. (There have been, however, a few reports of high network usage.)
    > - No direct access to your system.
    > - No storage of sensitive data. Snowflake proxies do not store any sensitive data, such as IP addresses or browsing history, on your system.
    > - Encrypted communication. All communication between the Snowflake proxy and the Tor network is encrypted, making it difficult for attackers to intercept or manipulate data.
    >
    > You are not hosting a VPN or a full Tor relay. Your role is limited to facilitating encrypted connections, similar to relaying a sealed envelope.
    >
    > Your IP address is exposed to the user (in a P2P-like connection). Your ISP could also potentially see the WebRTC traffic and the connections being made (but not their contents), so be mindful of your threat model.
    >
    > For most users, it is generally safe to run Snowflake proxies. Theoretically, your ISP will be able to see that connections are being made, but to them it will look like you're calling someone on, say, Zoom.
    >
    > Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting in trouble with law enforcement agencies, but none of them have been arrested or prosecuted as far as I know. If you are aware of any cases, let me know so I can update this post.
    >
    > Do not hesitate to check Snowflake's official documentation for further reference and to make informed decisions.
    >
    > ---
    >
    > ## How to Set Up a Snowflake Proxy
    >
    > ### Option 1: Browser Extension (Brave, Firefox, or Chrome)
    > 1. Install the Snowflake extension.
    > 2. Click the Snowflake icon in your browser toolbar and toggle "Enable Snowflake."
    > 3. Keep the browser open. That's all.
    >
    > Note: Brave users can enable Snowflake directly in settings. Navigate to brave://settings/privacy and activate the option under "Privacy and security."
    >
    > ---
    >
    > ### Option 2: Android Devices via Orbot
    > 1. Download Orbot (Tor's official Android app).
    > 2. Open the app's menu, select "Snowflake Proxy," and toggle it on.
    > 3. For continuous operation, keep your device charged and connected to Wi-Fi.
    >
    > Your device will now contribute as a proxy whenever the app is active.
    >
    > ---
    >
    > ### Addressing Common Concerns
    > - Battery drain: negligible. Snowflake consumes fewer resources than typical social media or messaging apps.
    > - Data usage: most users report under 1 GB per month. Adjust data limits in Orbot's settings or restrict operation to Wi-Fi if necessary.
    >
    > ---
    >
    > ## Why Your Participation Matters
    > Censorship mechanisms grow more sophisticated every year, but tools like Snowflake empower ordinary users to counteract them. Each proxy strengthens the Tor network's resilience, making it harder for authoritarian regimes to isolate their populations. By donating a small amount of bandwidth, you provide someone with a critical connection to uncensored information, education, and global dialogue.
    >
    > Recent surges in demand, particularly in Russia, highlight the urgent need for more proxies. Your contribution, however small, has an impact.
    >
    > By participating, you become part of a global effort to defend digital rights and counter censorship. Please also be mindful of your threat model and understand the potential risks (though they are very small for most people). Check Snowflake's official documentation for further reference, and don't make any decisions based on this post before taking the time to read through it.
    >
    > Please share this post to raise awareness. The more proxies, the stronger the network.
    >
    > – llama

    0

    Help people trying to circumvent censorship by running a Snowflake proxy!

    cross-posted from: https://lemmy.dbzer0.com/post/36880616 > > # Help Combat Internet Censorship by Running a Snowflake Proxy (Browser or Android) > > Internet censorship remains a critical threat to free expression and access to information worldwide. In regions like Iran, Russia, and Belarus, journalists, activists, and ordinary citizens face severe restrictions when trying to communicate or access uncensored news. You can support their efforts by operating a Snowflake proxy—a simple, low-impact way to contribute to a freer internet. No technical expertise is required. Here’s how it works: > > --- > > ## What Is Snowflake? > Snowflake is a privacy tool integrated with the Tor network. By running a Snowflake proxy, you temporarily route internet traffic for users in censored regions, allowing them to bypass government or institutional blocks. Unlike traditional Tor relays, Snowflake requires minimal bandwidth, no configuration, and no ongoing maintenance. Your device acts as a temporary bridge, not a permanent node, ensuring both safety and ease of use. > > --- > > ### Is This Safe for Me? > > Short answer: Yes. > > Long answer: probably. Here is why: > > - Your IP address is not exposed to the websites they access. So, you don't have to worry about what they are doing either. You are not an exit node. > - No activity logs. Snowflake cannot monitor or record what users do through your connection. The only stored information is how many people have connected to your bridge. Check docs for further info on this. > - Low resource usage. The data consumed is comparable to background app activity—far less than streaming video or music. There have been, however, a few cases of people reporting high network usage. > - No direct access to your system > - No storage of sensitive data. Snowflake proxies do not store any sensitive data, such as IP addresses or browsing history, on your system. > - Encrypted communication. 
All communication between the Snowflake proxy and the Tor network is encrypted, making it difficult for attackers to intercept or manipulate data. > > You are not hosting a VPN or a full Tor relay. Your role is limited to facilitating encrypted connections, similar to relaying a sealed envelope. > >Your IP address is exposed to the user (in a P2P-like connection). Be mindful that your ISP could also potentially see the WebRTC traffic and the connections being made to it (but not the contents), so be mindful of your threat model. > > For most users, it is generally safe to run Snowflake proxies. Theoretically, your ISP will be able to know that there are connections being made there, but to them it will look like you're calling someone on, say, Zoom. > > Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have a been a few cases of people running exit nodes and getting in trouble with law enforcement agencies, but none of them have been arrested or prosecuted as far as I know it. If you are aware of any cases, let me know so I can update this post. > >Do not hesitate to check Snowflake's official documentation for further reference and to make informed decisions. > > --- > > ## How to Set Up a Snowflake Proxy > > ### Option 1: Browser Extension (Brave, Firefox, or Chrome) > 1. Install the Snowflake extension. > 2. Click the Snowflake icon in your browser toolbar and toggle "Enable Snowflake." > 3. Keep the browser open. That’s all. > > Note: Brave users can enable Snowflake directly in settings. Navigate to brave://settings/privacy and activate the option under "Privacy and security." > > --- > > ### Option 2: Android Devices via Orbot > 1. Download Orbot (Tor’s official Android app). > 2. Open the app’s menu, select "Snowflake Proxy," and toggle it on. > 3. For continuous operation, keep your device charged and connected to Wi-Fi. 
> > Your device will now contribute as a proxy whenever the app is active. > > --- > > ### Addressing Common Concerns > - Battery drain: Negligible. Snowflake consumes fewer resources than typical social media or messaging apps. > - Data usage: Most users report under 1 GB per month. Adjust data limits in Orbot’s settings or restrict operation to Wi-Fi if necessary. > > --- > > ## Why Your Participation Matters > Censorship mechanisms grow more sophisticated every year, but tools like Snowflake empower ordinary users to counteract them. Each proxy strengthens the Tor network’s resilience, making it harder for authoritarian regimes to isolate their populations. By donating a small amount of bandwidth, you provide someone with a critical connection to uncensored information, education, and global dialogue. > > Recent surges in demand—particularly in Russia—highlight the urgent need for more proxies. Your contribution, however small, has an impact. > > By participating, you become part of a global effort to defend digital rights and counter censorship. Please, also be mindful of your threat mode and understand the potential risks (though very little for most people). Check Snowflake's official documentation for further reference and don't make any decisions based on this post before taking your time to read through it. > > Please share this post to raise awareness. The more proxies, the stronger the network. > > – llama

    3

    Help people trying to circumvent censorship by running a Snowflake proxy!

    cross-posted from: https://lemmy.dbzer0.com/post/36880616 > > # Help Combat Internet Censorship by Running a Snowflake Proxy (Browser or Android) > > Internet censorship remains a critical threat to free expression and access to information worldwide. In regions like Iran, Russia, and Belarus, journalists, activists, and ordinary citizens face severe restrictions when trying to communicate or access uncensored news. You can support their efforts by operating a Snowflake proxy—a simple, low-impact way to contribute to a freer internet. No technical expertise is required. Here’s how it works: > > --- > > ## What Is Snowflake? > Snowflake is a privacy tool integrated with the Tor network. By running a Snowflake proxy, you temporarily route internet traffic for users in censored regions, allowing them to bypass government or institutional blocks. Unlike traditional Tor relays, Snowflake requires minimal bandwidth, no configuration, and no ongoing maintenance. Your device acts as a temporary bridge, not a permanent node, ensuring both safety and ease of use. > > --- > > ### Is This Safe for Me? > > Short answer: Yes. > > Long answer: probably. Here is why: > > - Your IP address is not exposed to the websites users access through your proxy, so you don't have to worry about what they are doing either. You are not an exit node. > - No activity logs. Snowflake cannot monitor or record what users do through your connection. The only stored information is how many people have connected to your bridge. Check the docs for further info on this. > - Low resource usage. The data consumed is comparable to background app activity—far less than streaming video or music. > - No direct access to your system. > - No storage of sensitive data. Snowflake proxies do not store any sensitive data, such as IP addresses or browsing history, on your system. > - Encrypted communication. 
    All communication between the Snowflake proxy and the Tor network is encrypted, making it difficult for attackers to intercept or manipulate data. > > You are not hosting a VPN or a full Tor relay. Your role is limited to facilitating encrypted connections, similar to relaying a sealed envelope. > > Your IP address is exposed to the user (in a P2P-like connection). Your ISP could also potentially see the WebRTC traffic and the connections being made to it (but not the contents), so be mindful of your threat model. > > For most users, it is generally safe to run Snowflake proxies. In theory, your ISP will be able to tell that connections are being made, but to them it will look like you're calling someone on, say, Zoom. > > Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting in trouble with law enforcement agencies, but none of them have been arrested or prosecuted as far as I know. If you are aware of any cases, let me know so I can update this post. > > Do not hesitate to check Snowflake's official documentation for further reference and to make informed decisions. > > --- > > ## How to Set Up a Snowflake Proxy > > ### Option 1: Browser Extension (Brave, Firefox, or Chrome) > 1. Install the Snowflake extension. > 2. Click the Snowflake icon in your browser toolbar and toggle "Enable Snowflake." > 3. Keep the browser open. That’s all. > > Note: Brave users can enable Snowflake directly in settings. Navigate to brave://settings/privacy and activate the option under "Privacy and security." > > --- > > ### Option 2: Android Devices via Orbot > 1. Download Orbot (Tor’s official Android app). > 2. Open the app’s menu, select "Snowflake Proxy," and toggle it on. > 3. For continuous operation, keep your device charged and connected to Wi-Fi. 
    > > Your device will now contribute as a proxy whenever the app is active. > > --- > > ### Addressing Common Concerns > - Battery drain: Negligible. Snowflake consumes fewer resources than typical social media or messaging apps. > - Data usage: Most users report under 1 GB per month. Adjust data limits in Orbot’s settings or restrict operation to Wi-Fi if necessary. > > --- > > ## Why Your Participation Matters > Censorship mechanisms grow more sophisticated every year, but tools like Snowflake empower ordinary users to counteract them. Each proxy strengthens the Tor network’s resilience, making it harder for authoritarian regimes to isolate their populations. By donating a small amount of bandwidth, you provide someone with a critical connection to uncensored information, education, and global dialogue. > > Recent surges in demand—particularly in Russia—highlight the urgent need for more proxies. Your contribution, however small, has an impact. > > By participating, you become part of a global effort to defend digital rights and counter censorship. Please also be mindful of your threat model and understand the potential risks (though they are very low for most people). Check Snowflake's official documentation for further reference and don't make any decisions based on this post before taking your time to read through it. > > Please share this post to raise awareness. The more proxies, the stronger the network. > > – llama

    13
    How to run LLaMA (and other LLMs) on Android.
  • Though apparently I didn't need step 6 as it started running after I downloaded it

    Hahahha. It really is a little redundant, now that you mention it. I'll remove it from the post. Thank you!

    Good fun. Got me interested in running local LLM for the first time.

    I'm very happy to hear my post motivated you to run an LLM locally for the first time! Did you manage to run any other models? How was your experience? Let us know!

    What type of performance increase should I expect when I spin this up on my 3070 ti?

    That really depends on the model, to be completely honest. Make sure to check the model requirements. For llama3.2:2b you can expect a significant performance increase, at least.

  • Help people trying to circumvent censorship by running a Snowflake proxy!
  • Of course! I run several snowflake proxies across my devices and their browsers.

  • Help people trying to circumvent censorship by running a Snowflake proxy!
  • I didn't use an LLM to make the post. I did, however, use Claude to make it clearer since English is not my first language. I hope that answers your question.

  • How to run LLaMA (and other LLMs) on Android.
  • I have tried on more or less 5 spare phones. None of them have less than 4 GB of RAM, however.

  • Help people trying to circumvent censorship by running a Snowflake proxy!


    3

    Help people trying to circumvent censorship by running a Snowflake proxy!


    12

    How to run LLaMA (and other LLMs) on Android.

    Hello, everyone! I wanted to share my experience of successfully running LLaMA on an Android device. The model that performed the best for me was llama3.2:1b on a mid-range phone with around 8 GB of RAM. I was also able to get it up and running on a lower-end phone with 4 GB of RAM. However, I also tested several other models that worked quite well, including qwen2.5:0.5b, qwen2.5:1.5b, qwen2.5:3b, smallthinker, tinyllama, deepseek-r1:1.5b, and gemma2:2b. I hope this helps anyone looking to experiment with these models on mobile devices!

    ---

    Step 1: Install Termux

    1. Download and install Termux from the Google Play Store or F-Droid.

    ---

    Step 2: Set Up proot-distro and Install Debian

    1. Open Termux and update the package list:

    ```bash
    pkg update && pkg upgrade
    ```

    2. Install `proot-distro`:

    ```bash
    pkg install proot-distro
    ```

    3. Install Debian using `proot-distro`:

    ```bash
    proot-distro install debian
    ```

    4. Log in to the Debian environment:

    ```bash
    proot-distro login debian
    ```

    You will need to log in every time you want to run Ollama. Repeat this step and the steps below each time you want to run a model (except Step 3 and the Ollama installation in Step 4, which only need to be done once).
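    For reference, the Termux-side setup above condenses into one short session. This is just a sketch of the same commands, run from a fresh Termux install:

    ```shell
    # Run inside Termux, not a regular Linux shell.
    pkg update && pkg upgrade     # refresh Termux's own packages
    pkg install proot-distro      # install the distro manager
    proot-distro install debian   # one-time download of a minimal Debian rootfs
    proot-distro login debian     # drop into the Debian environment
    ```

    Everything after the `login` line runs inside Debian, not Termux.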

    ---

    Step 3: Install Dependencies

    1. Update the package list in Debian:

    ```bash
    apt update && apt upgrade
    ```

    2. Install curl:

    ```bash
    apt install curl
    ```

    ---

    Step 4: Install Ollama

    1. Run the following command to download and install Ollama:

    ```bash
    curl -fsSL https://ollama.com/install.sh | sh
    ```

    2. Start the Ollama server:

    ```bash
    ollama serve &
    ```

    After you run this command, press Ctrl+C; the server will continue to run in the background.

    ---
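    Before moving on, you can sanity-check that the server is reachable. Assuming Ollama's default port (11434) hasn't been changed, a plain GET against the root answers with a short status string:

    ```shell
    # Quick liveness check against the local Ollama server.
    curl http://localhost:11434
    # Should print: Ollama is running
    ```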

    Step 5: Download and run the Llama3.2:1B Model

    1. Use the following command to download and run the llama3.2:1b model:

    ```bash
    ollama run llama3.2:1b
    ```

    This step fetches and runs the lightweight 1-billion-parameter version of the Llama 3.2 model.
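    Once the model has been pulled, you don't have to use the interactive prompt; the background server also exposes an HTTP API. This sketch sends a single non-streamed prompt (endpoint and fields as documented in Ollama's API reference):

    ```shell
    # Ask the local server for one complete JSON response instead of a token stream.
    curl http://localhost:11434/api/generate -d '{
      "model": "llama3.2:1b",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'
    ```

    This is handy for scripting on the phone itself, since the response comes back as a single JSON object you can parse.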

    ---

    Running LLaMA and other similar models on Android devices is definitely achievable, even with mid-range hardware. The performance varies depending on the model size and your device's specifications, but with some experimentation, you can find a setup that works well for your needs. I’ll make sure to keep this post updated if there are any new developments or additional tips that could help improve the experience. If you have any questions or suggestions, feel free to share them below!

    – llama

    17

    How do you feel about your content getting scraped by AI models?

    I created this account two days ago, but one of my posts ended up in the (metaphorical) hands of an AI powered search engine that has scraping capabilities. What do you guys think about this? How do you feel about your posts/content getting scraped off of the web and potentially being used by AI models and/or AI powered tools? Curious to hear your experiences and thoughts on this.

    -------------

    # Prompt Update


    > The prompt was something like, "What do you know about the user llama@lemmy.dbzer0.com on Lemmy? What can you tell me about his interests?" Initially, it generated a lot of fabricated information, but it would still include one or two accurate details. When I ran the test again, the response was much more accurate compared to the first attempt. It seems that as my account became more established, it became easier for the crawlers to find relevant information.

    > It even talked about this very post on item 3 and on the second bullet point of the "Notable Posts" section.

    For more information, check this comment.

    ----

    Edit¹: This is Perplexity. Perplexity AI employs data scraping techniques to gather information from various online sources, which it then utilizes to feed its large language models (LLMs) for generating responses to user queries. The scraping process involves automated crawlers that index and extract content from websites, including articles, summaries, and other relevant data. It is an advanced conversational search engine that enhances the research experience by providing concise, sourced answers to user queries. It operates by leveraging AI language models, such as GPT-4, to analyze information from various sources on the web. (12/28/2024)

    Edit²: One could argue that data scraping by services like Perplexity may raise privacy concerns because it collects and processes vast amounts of online information without explicit user consent, potentially including personal data, comments, or content that individuals may have posted without expecting it to be aggregated and/or analyzed by AI systems. One could also argue that this indiscriminate collection raises questions about data ownership, proper attribution, and the right to control how one's digital footprint is used in training AI models. (12/28/2024)

    Edit³: I added the second image to the post and its description. (12/29/2024).

    96

    AI Data Scrapers: How do you feel about them?


    5

    Is Threads fully integrated with the Fediverse?

    I use both Threads and Mastodon. However, I realized that sometimes (public) profiles on Threads don't show up on Mastodon and vice versa. I also realized that most comments made on Threads posts don't show up on Mastodon – that is, if the posts appear on Mastodon at all. The same is true the other way around. Why does this happen?

    19

    Are there any mental health communities here on Lemmy?

    I've been using Lemmy since the Reddit exodus. I haven't looked back since, but I miss a lot of mental health communities that I haven't been able to find replacements for here on Lemmy. Does anyone know any cool mental health communities that are somewhat active?

    16