
Posts: 0 · Comments: 301 · Joined: 2 yr. ago

  • Double yeah

  • Yeah...

  • It was just charging infrastructure and reasonable range designs that were needed all along?! Who could've seen that coming.

  • I think they overplayed some situations (for example, the scene where Murderbot is ambushed by a SecUnit, who is in turn ambushed by the bug after punching eggs), but overall I've really been enjoying the show. It's got a certain lighthearted nature to it that I feel I could watch for quite some time.

  • Actually, we're starting a new tax just for solar and wind installations while boosting funding for petroleum and coal... 😩 I'm so fucking tired.

  • They're different in their implementation. Zigbee's automesh is more of a centralized router-hub model, with self-healing relying on routing tables; that caused significant issues for me. Thread is true automesh, with every mains-powered device acting as a router, so there's no centralized routing table to act as a single point of failure.

  • Unscheduled home upgrade time ;)

  • An important difference between Thread and Zigbee/Wi-Fi I'm not seeing mentioned is that all Thread devices automesh, each acting as a local hub, as long as they're not battery powered. So your light bulbs, plugs, etc. all become extenders and part of a self-healing mesh network without a single point of failure. For me it works better than Zigbee for this reason.
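The self-healing property described above can be sketched as a toy reachability check: if one routing node fails, traffic still reaches battery-powered end devices through another mains-powered router. The device names and topology here are hypothetical, and real Thread routing (device roles, link-quality metrics) is far more involved than plain BFS.

```python
from collections import deque

def has_route(links, src, dst, down=frozenset()):
    """BFS reachability over a mesh, skipping failed nodes."""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in links.get(node, ()):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Hypothetical Thread-style mesh: every mains-powered device routes.
mesh = {
    "border_router": ["bulb_a", "plug_a"],
    "bulb_a": ["border_router", "plug_a", "sensor"],
    "plug_a": ["border_router", "bulb_a", "sensor"],
    "sensor": ["bulb_a", "plug_a"],  # battery device: end node only
}

# With bulb_a failed, traffic still reaches the sensor via plug_a.
print(has_route(mesh, "border_router", "sensor", down={"bulb_a"}))  # True
```

Zigbee's centralized routing-table model, by contrast, has no second path to fall back on once the table's owner misbehaves.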

  • Actually, that depends on the LLM product, as it's very, very rare to interact with a raw LLM these days. Most use programmed toolsets now to perform certain tasks; ChatGPT, for example, uses a Bing API to retrieve and link URLs, so it's only as good as Bing for validation in that case. Which still is not very good. Most search engines actually suck at providing a secure experience for the end user. Google has gotten better, if not downright brilliant, but they don't vet their ads (cash flow) like their linked URLs, so it's a moot point to the end user in that case, imo.
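The toolset pattern described above can be sketched as a dispatch loop: the model doesn't fetch URLs itself, the product routes a structured tool call to a real search backend. All names and message fields here are invented for illustration; this is not OpenAI's actual API shape.

```python
# Hypothetical stand-in for the product's search backend call.
def bing_search(query):
    return [{"url": "https://example.com", "title": f"Result for {query}"}]

TOOLS = {"web_search": bing_search}

def handle_model_output(message):
    """If the model emitted a tool call, run it; otherwise pass text through."""
    if message.get("tool_call"):
        call = message["tool_call"]
        return TOOLS[call["name"]](**call["arguments"])
    return message["content"]

results = handle_model_output(
    {"tool_call": {"name": "web_search",
                   "arguments": {"query": "zigbee vs thread"}}}
)
print(results[0]["url"])  # https://example.com
```

The point of the sketch: whatever quality the backend has (Bing's, in ChatGPT's case) is the ceiling on the quality of the links the user sees.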

  • People use Google for URLs and it continues to deliver malware to them, often even sponsored as "ads". Not sure why ChatGPT would be any different.

  • You mean the chocolate flavored sugared palm oil? Might be more expensive to bake the biscuit, lol.

  • The 2nd amendment, at least for now.

  • Depends on the drive too; I have some insanely loud IronWolf drives, and you would never guess they're from the same manufacturer as my practically silent Exos X18s.

  • What hurts most is watching them hurt themselves while pretending to come from the moral high ground. I'm not sure if it's a CIA psyop or what, but something just happens in someone's brain, and it seems like they stop caring about reality as long as you give them somewhere to channel their hate. You did the right thing, though. If we all stopped playing their bullshit game and refused to work with the people sabotaging the rest of us, we'd have everything fixed in just a few years.

  • You're missing the point, though, maybe? You can't take data, run it through what is essentially lossy compression, and then get the same data back out. The best you can do is a facsimile of it that suffers in some regard.
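A tiny illustration of the lossy round-trip point: quantizing floats to 8-bit levels discards information permanently, so decompression yields a close facsimile but never the original. The quantizer here is just a toy stand-in for whatever lossy transformation the data went through.

```python
# Lossy "compression": snap each sample to one of 256 evenly spaced levels.
def compress(samples, levels=256):
    lo, hi = min(samples), max(samples)
    step = (hi - lo) / (levels - 1)
    return [round((s - lo) / step) for s in samples], lo, step

def decompress(codes, lo, step):
    return [lo + c * step for c in codes]

data = [0.0, 0.111, 0.25, 0.999, 1.0]
codes, lo, step = compress(data)
restored = decompress(codes, lo, step)

# A facsimile close to the original, but not the original.
print(restored == data)  # False
```

Note that 0.999 and 1.0 land on the same level, so no decoder, however clever, can tell them apart afterward; that information is simply gone.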

  • Looking at the network activity of a Pixel device vs. an iPhone at rest broke my soul.

  • Also, I'd forgotten to mention: what you see in the on-screen representation is entirely divorced from the actual stack doing your driving. They're basically running a small video game using the virtual world map they build and rendering in assets from there. It's meant to give you a reasonable look at what the car sees and might do, but they've confirmed that it is in no way tied to the underlying neural decision network.

  • The blocker for Tesla is that it's processing 2D input in order to navigate 3D space. They use some AI trickery, building virtual anchor points from image stills across points in time, to work around this and get back to a 3D space, but the auto industry at large (not me) has collectively agreed this cannot overcome numerous serious challenges in realistic applications. The one people may be most familiar with is Mark Rober's test where the Tesla just drives right into a wall painted to look like the road, Wile E. Coyote style, but this has real-world analogs such as complex weather. Lidar and ultrasonics integrated into the chain of trust can mitigate a significant portion of the risk this issue causes, and already do for most ADAS systems; Volvo has shown that even low-resolution "cheap" lidar sensors without 360-degree coverage can offer most of these benefits. To be honest, I'm not certain that the addition would fix everything; perhaps the engineering obstacles really were insurmountable. But from what I hear from the industry at large, from my friends in the space, and from my own common sense, I don't see how a wholly 2D implementation relying on only camera input can be anything but an insurmountable engineering challenge on the way to a minimum viable product. From my understanding, it'd be like being told you have to use water, and only water, as your hydraulic fluid, or that you can only use a heat lamp to cook for your restaurant. It's just legitimately unsuitable for the purpose despite giving off the guise of doing the same work.

  • I think of it as a lab because it's my sandbox for doing crazy server stuff at home that I'd never do on my production network at work. I think that's why the name stuck: back when systems were expensive as heck, it was pretty much just us sysadmin guys hauling home old gear to mess with.