Smartaira fiber. Best I can gather, they're using a managed switch and segmenting each port, probably per floor. They specialize in large-scale WiFi deployment and that's what they're doing here. It's a genius way to provide basic web access with a minimal hardware footprint for the provider and no hardware but a PoE AP for the users. It just sucks for those of us who know better.
That's an interesting concept. I bought my unit two weeks ago, when they still had cable modems and a setup I know I could have worked with. I'm politically active, so getting on the board should be an option. However, what's in the best interest of the vast, vast majority of the owners? Your standard service that requires complex gateways and running coax all over your apartment, with hardware rental fees and limits on TV number and location? Or a system where your smart TV can connect anywhere, your iPhone can always get onto Facebook, and there's a 24/7 tech support line to change your WiFi password for you? If it costs each owner $1 more per month (500 units) for my preferred network architecture so that three residents can save $70 per month ($210 total), I would be failing in my fiduciary duty by charging the masses more so a select few can self-host. We are the minority, and the rest don't care.
I actually gave it 44000-65535 and it's connecting well. That's another reason why I wanted a more robust network: an IoT VLAN to segregate that risk.
The setup is very strange. They don't provide a router. They took the old phone lines going to each unit (which appear to have been run in Cat5 decades ago) and put an RJ-45 end on them. That plugs into a PoE-powered wireless access point with two more ports on it. Plugging my laptop in, the gateway does not respond to HTTP requests. The tech who installed it said I have to call the home office to change my wireless password. I got them to disable the wireless so I could put my router on the other end, but I'm either running on a network that my shady small-time ISP has full control over or I'm behind a double NAT. Speeds were 900+ up and down, though.
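For anyone in the same boat, the double-NAT suspicion is quick to confirm. A sketch (ifconfig.me is just one of many what's-my-IP services; substitute whichever you trust):

```shell
# Compare your router's reported WAN address to your actual public IP.
curl -s https://ifconfig.me && echo

# If the router's WAN side shows an RFC 1918 address (10.0.0.0/8,
# 172.16.0.0/12, 192.168.0.0/16) or CGNAT space (100.64.0.0/10) that
# differs from the IP printed above, there's another NAT layer upstream.
traceroute -n 8.8.8.8 | head -5   # several private-address hops also betray double NAT
```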
I might see if I can get the AP re-enabled and let the Nintendo Switch connect to it directly, if that even fixes the Switch's NAT issues.
I just got a Ubiquiti Dream Machine that can do failover, so the other connection won't be completely wasted, but $70 per month could be saved by finding another way.
A little searching suggests Cloudflare Argo tunnels (since renamed to just Cloudflare Tunnel) might be a good route to try. And possibly free, though I'm not opposed to paying for a better service. There seems to be a fair amount of step-by-step documentation on this. I'll demo it in my lab, as I haven't moved it to the new apartment yet.
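For reference, the basic flow is roughly the following (a sketch based on the cloudflared docs; the tunnel name, hostname, and the local port 8080 are placeholders):

```shell
# One-time setup: authenticate, create a named tunnel, and map DNS to it.
# The connection is outbound-only, so no inbound port forwarding is needed.
cloudflared tunnel login
cloudflared tunnel create homelab
cloudflared tunnel route dns homelab remote.mywebsite.com

# Minimal config; the credentials JSON is written by `tunnel create`.
cat > ~/.cloudflared/config.yml <<'EOF'
tunnel: homelab
credentials-file: /root/.cloudflared/<TUNNEL-ID>.json
ingress:
  - hostname: remote.mywebsite.com
    service: http://localhost:8080   # whatever internal service you expose
  - service: http_status:404         # required catch-all rule
EOF

cloudflared tunnel run homelab
```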
It depends on the app. Yes, I could run my password manager on the VPS since that takes up virtually no space or bandwidth. The odd IP camera needs to be local, the Minecraft server with mods needs local CPU power and RAM (presumably).
New apartment Internet has no port forwarding or admin login
Pro: 1Gb upload and download speeds on free Internet provided by the HOA. Con: As a self hoster, I have zero control over it. No port forwarding, no DMZ, no bridge mode. It's Starbucks free WiFi with a wired connection.
Option A: Buy Google Fiber and don't use free Internet. Option B: Create some elaborate tunnel through a VPS.
My public self hosted activities are fairly low bandwidth (password manager, SSH). I have a vague idea that I could point my domain to a low cost VPS that has a VPN tunnel into my home network for any incoming connection needs. That may require me to fill in port forwards on both systems but whatever. Tailscale is serving most of my remote needs but I still need a few ports. This does not fix the issue of online gaming port forwards (Nintendo Switch online requires a huge forwarded range for best performance) but oh well for now.
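The VPS-relay idea can be sketched roughly like this, assuming a WireGuard tunnel with the VPS at 10.0.0.1 and home at 10.0.0.2 (the interface names and port 8443 are made up for illustration):

```shell
# On the VPS: forward inbound TCP 8443 down the WireGuard tunnel to home.
sysctl -w net.ipv4.ip_forward=1
iptables -t nat -A PREROUTING  -i eth0 -p tcp --dport 8443 \
         -j DNAT --to-destination 10.0.0.2:8443
# Masquerade so the home server's replies route back through the VPS.
iptables -t nat -A POSTROUTING -o wg0 -p tcp --dport 8443 \
         -j MASQUERADE

# The home end must dial out (no inbound ports there), so its wg0.conf
# peer section needs: PersistentKeepalive = 25
```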
UPDATE: I think they're using this system. https://www.cambiumnetworks.com/markets/multi-family-living/ The personal Wi-Fi ov
What I need is 10Gb storage for my Adobe suite that I can access from my MacBook. I need redundant, fault-tolerant storage for my precious data. I need my self-hosted services to be highly available. What's the minimum spec to reach that? I started down the U.2 path when I saw enterprise U.2 drives at a similar cost per GB as SATA SSDs but faster and with crazy endurance. And when my kid wants to run a Minecraft server with mods for him and his friends, I'd better have some spare CPU cycles and RAM to keep up.
Where do you find the bandwidth to do all that? NVMe eats it up, and the 40g too.
I'm afraid of dumping 500+ watts into an (air-conditioned) closet. How are you able to saturate the 10g? I had some idea that Ceph write speed is limited by the slowest drive, so even SATA SSDs won't fill the bucket. I imagine this is due to replication rather than parity/striping spreading the data. I'd like to stick to lower-power consumer gear, but Ceph looks CPU, RAM, and bandwidth (storage and network) hungry, plus latency-sensitive.
I ran Proxmox/Ceph over 1GbE on e-waste mini PCs and it was... unreliable. Now my NAS is my HA storage, but I'm not thrilled to beat up QLC NAND for hobby VMs.
I looked at Epyc because I wanted the bandwidth to run U.2 drives at full speed, and it wasn't until Epyc or Threadripper that you could get much more than 40 PCIe lanes in a single socket. I've got to find another way to saturate 10g and give up on 25g. My home automation runs on a Home Assistant Yellow and works perfectly, for what it does.
What's your server wattage?
I'm in the process of wiring a home before moving in and getting excited about running 10g from my server to my computer. Then I see 25g gear isn't that much more expensive, so I might as well run at least one fiber line. But what kind of three-node Ceph monster will it take to make use of any of this bandwidth (plus run all my Proxmox VMs and LXCs in HA), and how much heat will I have to deal with? What's your experience with high-speed homelab NAS builds and the electric bill shock that comes later? The Epyc 7002 series looks perfect but seems to idle high.
I ended up going with Tailscale. Every other option exposed my secret services to the Internet, even if behind a password. Tailscale was ridiculously easy to set up, too. The docker compose I used had Heimdall in it, so I was able to put all my links on there. The procedure is: connect with the Tailscale app -> go to http://illegalshit -> click/tap on the relevant link. I might pull back on my Nginx proxy targets and port forwards for this more secure system.
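A minimal sketch of that kind of pairing, shown as equivalent docker run commands rather than the exact compose file (the TS_AUTHKEY/TS_STATE_DIR variables are from the tailscale/tailscale image docs; the host paths and key are placeholders):

```shell
# Tailscale container joins the tailnet and persists its state.
docker run -d --name=tailscale \
  --cap-add=NET_ADMIN --device /dev/net/tun \
  -v /srv/tailscale:/var/lib/tailscale \
  -e TS_AUTHKEY=tskey-auth-XXXX \
  -e TS_STATE_DIR=/var/lib/tailscale \
  tailscale/tailscale

# Heimdall shares the Tailscale container's network namespace,
# so the dashboard is reachable only over the tailnet.
docker run -d --name=heimdall \
  --network=container:tailscale \
  -v /srv/heimdall:/config \
  lscr.io/linuxserver/heimdall
```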
What happens if tailscale goes down though?
Secure portal between Internet and internal services
I thought I was going to use Authentik for this, but it just seems to redirect to an otherwise Internet-accessible page. I'm looking for a way to remotely access my home network at a site like remote.mywebsite.com. I have Nginx proxy forwarding with SSL working appropriately, so I need an internal service that receives the traffic, logs me in, and passes me through to services I don't want to expose to the Internet.
My issue with Authentik is that if I need to access questionable internal websites, I have to make an Internet-accessible subdomain. I don't want authentik.mywebsite.com to redirect to totallyillegal.mywebsite.com. I want it to redirect to 10.1.1.30:8787.
Is there anything that does that?
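For what it's worth, Nginx's auth_request module can gate a proxy_pass to a bare internal IP, so the backend never needs its own DNS name. A sketch using Authentik's nginx forward-auth endpoint (the /outpost.goauthentik.io paths come from Authentik's docs; the authentik-server upstream name is an assumption, and the internal address matches the 10.1.1.30:8787 example above):

```nginx
# remote.mywebsite.com gates everything behind an auth subrequest,
# then proxies straight to the internal-only service.
server {
    listen 443 ssl;
    server_name remote.mywebsite.com;

    location / {
        auth_request /outpost.goauthentik.io/auth/nginx;  # 2xx = allowed
        error_page 401 = @login;
        proxy_pass http://10.1.1.30:8787;   # never published via DNS
    }

    location /outpost.goauthentik.io {
        proxy_pass http://authentik-server:9000;  # internal outpost, assumption
    }

    location @login {
        return 302 /outpost.goauthentik.io/start?rd=$request_uri;
    }
}
```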
Late reply, but yeah, WiFi was a nightmare on Proxmox. It was a tiny e-waste SFF PC, so I was able to wedge it near the other servers. The cluster is happy.
Proxmox HA, Docker Swarm, Kubernetes, or what?
I've gotten to the point where I have more than a few servers in my homelab and am looking for a way to increase reliability in case of an update. Two problems: two of the servers will be on WiFi, and one is a Synology NAS. I can't do any wiring, but I can put together a WiFi 6E network for the servers only. That means buying four WiFi 6E devices in a mix of types. As for the Synology, its container manager is a little odd, so I expect to run a Linux VM and use that as my cluster node. That may mean buying more RAM, as I haven't upgraded it. Hardware ranges from a 6-core CPU on the NAS (with a few important docker containers), an 8-core on my main SFF server (which also runs my OPNsense VM inside Proxmox), a 16-core Ryzen on my old big server, and a 10-year-old NUC for fun. So the question is: what do I use to orchestrate all the services I have? My Vaultwarden runs reliably, but only on one system. I want better reliability for Pi-hole that automatically syncs settings. The NAS' docker implementation
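For the Pi-hole piece specifically, one common pattern that works regardless of which orchestrator you pick is two Pi-hole instances behind a keepalived virtual IP, with something like gravity-sync handling the settings replication. A sketch; the VIP, router ID, and interface name are made up:

```
# /etc/keepalived/keepalived.conf on the primary Pi-hole host.
# The secondary runs the same block with "state BACKUP" and a lower priority.
vrrp_instance DNS_VIP {
    state MASTER
    interface eth0          # assumption: adjust to your NIC
    virtual_router_id 53
    priority 150
    advert_int 1
    virtual_ipaddress {
        192.168.1.53/24     # hand this VIP out via DHCP as the DNS server
    }
}
```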
As I recall, when I turned off location tracking my time zone would be wrong. It's frustrating that you lose out when protecting your privacy.
The rabbit hole took me from Airtable to Baserow (which I have up and running, but with the built-in DB), but now I need to make a viewer for the people. So the next step is visualizing the events list and filtering by day and whatever else is needed to get useful info from the DB.
Thanks. Some screenshot of a tweet isn't a source, but what you posted was.
Is there a source that doesn't require me to log into Twitter?
Upvote for Pebble. It was the best of all time. I could operate the buttons without even looking.
Pebble Time. It was the best ever until the battery swelled. My Fitbit Versa 2 was good until the battery gave out. My latest Fitbit has worse software with fewer features than the last, and I hate it. There are no apps available anymore.
So some news outlets get to protect their precious little articles from the big bad AI, which will probably destroy news as we know it anyway
I was thinking about this. What happens when all the big outlets are having AI write their news? You can't get answers on today's news without feeding the model today's news. Therefore, somebody has to create the data source.
I see a few scenarios:
- Google scrapes, aggregates, and summarizes to the point that nobody reads the article/sees the ads and the news site goes under. Then Google has nothing to scrape but press releases and government sources. Or...
- News sites block access to scrapers and charge for it but may be wary of crossing their customers (news aggregators) in their coverage
- The above creates a tiered system where premium news outlets (AI-assisted writing but with human insight) are too expensive for ad-supported Google to scrape, so Google gets second-tier news from less reliable, more automated sources, or simply makes it themselves. Why not cut out the middleman?
- Rogue summarizers will still scrape the real news outlets and summarize stories to sell to Google. This will again make paid news a luxury, since someone with a subscription will summarize and distribute the main point (okay) or their spin (bad).
I'm failing to see where this will go well. Is there another scenario?