While the mass adoption of AI has transformed digital life seemingly overnight, regulators have fallen asleep on the job in curtailing AI data centers’ drain on energy and water resources.
Google's GHG emissions in 2023 were 14.3 million metric tons, which is a ridiculously small share of global emissions.
Commercial aviation, for comparison, emits roughly 935 million metric tons per year.
So IDK about plastic straws or Google. But really, if people stopped flying around so much, that would actually make a dent in global emissions.
Don't get me wrong, Google is a piece of shit. But they are not the ones causing climate change, and neither is AI technology. Planes, cars, the meat industry, offshore production... those are some of the truly big culprits.
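To put those numbers side by side (the ~37,000 million metric ton global CO2 total is my own ballpark assumption, not a figure from this thread):

```python
# Rough scale comparison using the figures above.
# The ~37,000 Mt/year global CO2 total is an outside ballpark assumption.
google_mt = 14.3        # Google's reported 2023 GHG emissions, million metric tons
aviation_mt = 935.0     # commercial aviation per year, million metric tons
global_mt = 37_000.0    # assumed global CO2 emissions, million metric tons

print(f"Google vs aviation: {google_mt / aviation_mt:.1%}")   # ~1.5%
print(f"Google vs global:   {google_mt / global_mt:.3%}")     # ~0.039%
print(f"Aviation vs global: {aviation_mt / global_mt:.1%}")   # ~2.5%
```

So even if Google's footprint went to zero tomorrow, it would save about one sixty-fifth of what grounding commercial aviation would.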
The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn't necessary and I'm not convinced it results in a "better search result" for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.
With adblock enabled, I feel like their results are often better than, for example, DuckDuckGo's. I recently switched to DDG as my default search engine, but I regularly find myself going back to Google to get the results I'm looking for.
Interesting, I'm actually the exact opposite. I always start with Google, because it's usually good enough, but whenever it takes 2-3 tries to get something relevant, I switch to ddg and get it first try.
AI is just what crypto bros moved onto after people realized that was a scam. It's immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it's being backed by major corporations because it means fewer employees they have to pay.
There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren't just a scam. However, most of these are not consumer facing and the average person won't really hear about them.
It's unfortunate that what you said is very true on the consumer side of things...
I would love to see an AI make an ANSYS model that isn’t shit. They might be able to make cute pictures, but when it comes to making models for CFD or FEA, AI is a complete waste of time.
I skimmed the article, but it seems to assume that Google's LLM uses the same architecture as everyone else's. I'm pretty sure Google uses their own TPU chips instead of regular GPUs, and those are generally pretty energy efficient.
That and they don't seem to be considering how much data is just being cached for questions that are the same. And a lot of Google searches are going to be identical just because of the search suggestions funneling people into the same form of a question.
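The caching point is easy to sketch: if identical (or near-identical) questions hit a cache, the expensive generation step only runs once. A toy illustration in Python, where `expensive_answer` is a hypothetical stand-in for a full LLM generation:

```python
# Toy sketch of why caching identical queries saves energy: a naive
# in-memory cache keyed by the normalized query text. expensive_answer
# is a hypothetical placeholder for a full LLM generation.
calls = 0

def expensive_answer(query: str) -> str:
    global calls
    calls += 1                      # count how often we "run the model"
    return f"answer to: {query}"

cache: dict[str, str] = {}

def answer(query: str) -> str:
    key = query.strip().lower()     # suggestions funnel users to the same form
    if key not in cache:
        cache[key] = expensive_answer(key)
    return cache[key]

# Three users, same suggested question -> one generation, not three.
for q in ["How tall is Everest?", "how tall is everest?", "How tall is Everest? "]:
    answer(q)
print(calls)  # 1
```

A production system would use something smarter than exact-match normalization, but the principle is the same: the funneling effect of search suggestions makes cache hits very likely.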
I hadn't really heard of the TPU chips until a couple weeks ago when my boss told me about how he uses USB versions for at-home ML processing of his closed network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy...but I looked the USB things up and they're wildly efficient and he says they work just fine for his applications. I was impressed.
The Coral is fantastic for use cases that don't need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.
It runs TensorFlow Lite, so you can also build your own models.
The Coral ones? They don't have nearly enough RAM to handle LLMs - they only have 8 MB of RAM and only support small TensorFlow Lite models.
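A back-of-the-envelope check shows why 8 MB rules out LLMs (the parameter counts below are illustrative assumptions, not specs from this thread):

```python
# Why ~8 MB of on-device memory can't hold an LLM, even aggressively
# quantized. Parameter counts are illustrative ballpark figures.
BYTES_PER_INT8_WEIGHT = 1
coral_ram_mb = 8

def model_size_mb(params: int) -> float:
    """Approximate int8-quantized model size in MB (1 byte per weight)."""
    return params * BYTES_PER_INT8_WEIGHT / 1e6

mobilenet_params = 3_500_000        # small vision model: fits easily
tiny_llm_params = 1_000_000_000     # "small" 1B-parameter LLM: doesn't

print(f"MobileNet-class model: ~{model_size_mb(mobilenet_params):.1f} MB "
      f"(fits: {model_size_mb(mobilenet_params) <= coral_ram_mb})")
print(f"1B-parameter LLM:      ~{model_size_mb(tiny_llm_params):.0f} MB "
      f"(fits: {model_size_mb(tiny_llm_params) <= coral_ram_mb})")
```

Even a "tiny" 1B-parameter model needs on the order of a gigabyte at int8, roughly two orders of magnitude more than the chip has, while small vision models fit comfortably.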
Google might have some custom-made non-public chips though - a lot of the big tech companies are working on that.
instead of a regular GPU
I wouldn't call them regular GPUs... AI use cases often use products like the Nvidia H100, which are specifically designed for AI. They don't have any video output ports.
The confounding part is that when I do get offered an "AI result", it's basically identical to the excerpt in the top "traditional search" result. It wastes considerably more time and energy just to repeat what the top of the search said anyway. I've never seen the AI overview be more useful than the top snippet.
I don’t know if lemmians (lemmings?) live in the same world I live in; I find AI a HUGE productivity boost in search, writing, generation, and research.
Of course one has to check results carefully, it won’t remove the need for quality checks and doing stuff yourself. But as a sparring partner, an idea creator and an assistant, I am convinced people who make use of Claude/GPT etc will outperform people who don’t by a wide margin.
AI definitely has its use cases and boosts productivity if used right. But what Google did is the most bullshit thing I've ever seen - an example of useless AI born from the "need" to use AI.
I find LLMs to be entirely worthless in any kind of engineering analysis or scientific reference. They have done nothing but hinder the design process, and we have moved entirely away from them; it’s a complete joke.
It’s more that there is a vocal minority against it. I’d guess most of us are mostly neutral about it, we see the problems and the benefits but don’t see the need to comment everywhere about our feelings towards it.
If only they did what DuckDuckGo did: make it pop up only in very specific circumstances, draw primarily from current summarized information from Wikipedia in addition to its existing context, and let the user turn it off completely with one click of a settings toggle.
I find it useful in DuckDuckGo because it's out of the way, unobtrusive, and only pops up when necessary. I've tried using Google with its search AI enabled, and it was the most unusable search engine I've used in years.
In fact, I regularly use their anonymized LLM Chat tab to help out with restructuring data, summarizing some more complex topics, or finding some info that doesn't readily appear near the top of search. It's made my search experience (again, specifically in my circumstance) much better than before.
It's hidden in the sense that the normal user does not see the true cost on their energy bill. You perform a search and get the result in milliseconds, which makes it easy to get the false impression that it's just a minor operation. It's not like driving a car, watching the fuel gauge, and seeing the consumption.
Of course one can research how much energy Google consumes and find out the background – IF you're interested. But most people just use tech and do not question or even understand.
For you it might be clear. For the overwhelming majority of people, this is news. People don't know shit about tech. Most would assume the AI thingy does its thing on the local computer.
Most would assume the AI thingy does its thing on the local computer.
Good thing Nvidia's massive market cap isn't a big news item. It's also worth noting that nobody is talking about all the hardware Microsoft and Google are building, and surely nobody will mention that Nvidia is backordered for years on hardware.
It's even better that modern consumer-grade CPUs aren't being marketed on the NPUs they now include.
Or there are bigger fish to fry, like coal power plants, wasted idle capacity, etc. Sure, Google's AI search results are wasteful in some cases, but I've found them to actually improve my experience (though I use a mix of search engines). You can always use one without AI, or turn it off. It's being pushed into almost every search, though, which I really don't think is useful; it would be better if they cached the answers instead of regenerating them on demand. They may cache some, though.
As long as we are talking about crypto, and I know this is shocking, turns out some people are using it for unsavory acts. It is okay if you want to sit down.
To be fair, it was never "hidden" since all the top 5 decided that GPU was the way to go with this monetization.
Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed with a similar energy reduction, at a fraction of the cost and hassle for cloud providers.
I'm genuinely curious where their penny-pinching went. All these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won't bring them any more income. How do they justify that?
It's another untapped market they can monopolize. (Or just run at a loss because investors are happy with another imaginary pot of gold at the end of another rainbow.)
This is terrible. Why don't we build nuclear power plants, roll out a carbon tax, and create incentives for companies to generate their own energy via renewables?
You know, the shit we should have been doing before I was born.
I was talking about the photonic chips from Eindhoven University of Technology in the Netherlands, but looking into it the last update on that was four years ago.
I know they were entangling photons, so there's probably lots of research and parts still to work out, but still, maybe it's just not moving forward.
I'm not that familiar with the technology, but the benefits seem worthy of more news than they get.
I'm surprised it's only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.
Same. I think I've read that a single GPT-4 instance runs on a 128-GPU cluster, and ChatGPT can still take something like 30 seconds to finish a long response. An H100 GPU has a TDP of 700 W. Hard to believe that uses only 10x more energy than a search that takes milliseconds.
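Taking those figures at face value, the arithmetic backs up the skepticism (the ~0.3 Wh per classic search is an old, often-quoted Google estimate and is my assumption here, as is billing the whole cluster's TDP to one response):

```python
# Back-of-the-envelope energy estimate from the figures above.
# Assumes the whole cluster runs at full TDP for the whole response,
# which overstates the per-response cost of a batched deployment.
gpus = 128
tdp_watts = 700          # H100 TDP
response_seconds = 30

joules = gpus * tdp_watts * response_seconds   # 2,688,000 J
wh = joules / 3600                             # convert to watt-hours

search_wh = 0.3   # assumed energy of a classic web search (old Google figure)

print(f"Energy per long LLM response: ~{wh:.0f} Wh")            # ~747 Wh
print(f"Ratio vs a classic search:    ~{wh / search_wh:.0f}x")  # ~2489x
```

That's thousands of times a plain search, not 10x - though real deployments serve many requests per cluster concurrently, so the true per-response figure is likely far lower than this naive worst case.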
I wonder what the power consumption of getting to the information in the summary is as a whole when using a regular search, clicking on multiple links, finding the right information and extracting the relevant parts. Including the expenditures of energy by the human performing the task and everything that surrounds the activity.
There are real concerns surrounding AI, I wonder if this is truly one of them or if it’s just poorly researched ragebait.
I would point out that Google has been "carbon neutral" with its data centers for quite some time, unlike others who still rape the environment, ahem, AWS.
This is the one article that’s popped up after months of nothing. There was a larger response when it was something people saw as worthless consuming large quantities of power, compared to the “helpful AI” that’s consuming the power now.
Shows that it never was about the environment in the first place. Hypocrites.
It's a different debate because bitcoiners lose literally nothing to green power laws and banning coal. AI users would see a reduction in output quality/speed.