Modern AI data centers consume enormous amounts of power, and it looks like they will get even more power-hungry in the coming years as companies like Google, Microsoft, Meta, and OpenAI strive towards artificial general intelligence (AGI). Oracle has already outlined plans to use nuclear power plants for its 1-gigawatt data centers, and Microsoft appears set to do the same: it just inked a deal to restart a nuclear power plant to feed its data centers, reports Bloomberg.
Even if it didn’t improve further, there would still be uses for the LLMs we have today. And LLMs are only one kind of AI; the kind that makes all the images and videos is completely separate, and that has come a long way too.
I made this chart for you:
------ Expectations for AI
----- LLMs’ actual usefulness
----- What I think of it
----- LLMs’ usefulness after accounting for costs
Bruh, you have no idea about the costs. I doubt you have even tried running AI models on your own hardware. There are literally models that will run on a decent smartphone. Not every LLM is ChatGPT, enormous in size and resource consumption and hidden behind a veil of closed-source technology.
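For what it’s worth, running a small model locally really is a few lines these days. Here’s a minimal sketch using llama-cpp-python, assuming you’ve already downloaded some small quantized GGUF model (the model path below is a placeholder, not a real file):

```python
# Minimal local inference sketch with llama-cpp-python.
# Assumes: `pip install llama-cpp-python` and a small quantized
# GGUF model on disk (the path below is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-model-q4.gguf",  # placeholder path
    n_ctx=2048,  # modest context window keeps memory use low
)

out = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
print(out["choices"][0]["text"])
```

A quantized model in the 1–3B-parameter range like this runs fine on a laptop CPU, which is the whole point: inference cost scales with model size, and not everything is a trillion-parameter hosted service.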
Also, that trick isn’t going to work by just looking at a comment. Lemmy compresses whitespace because it uses Markdown; the extra lines only show up when you reply.
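You can see the collapsing for yourself with the Python `markdown` package. It isn’t Lemmy’s actual renderer, but folding single newlines into one paragraph is standard Markdown behavior:

```python
# Demonstrates why whitespace tricks die in rendered Markdown:
# consecutive lines fold into a single <p>, and only a blank line
# starts a new paragraph. Assumes: `pip install markdown`.
import markdown

print(markdown.markdown("line one\nline two"))
# <p>line one\nline two</p>  -- one paragraph; renders on a single line

print(markdown.markdown("line one\n\nline two"))
# <p>line one</p>\n<p>line two</p>  -- blank line = separate paragraphs
```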
Can I ask you something? What did Machine Learning do to you? Did a robot kill your wife?
Earlier this year, the International Energy Agency released its electricity usage report and forecast, predicting that the total global electricity consumption of data centers is set to top 1 PWh (petawatt-hour) in 2026. That is more than double the 2022 figure and, as the report states, “is equivalent to the electricity consumption of Japan.” SOURCE
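Quick sanity check on those numbers (the ~460 TWh 2022 baseline is the IEA’s own estimate from the same report):

```python
# Back-of-the-envelope check of the IEA figures.
consumption_2022_twh = 460     # IEA estimate for data centers in 2022
projection_2026_twh = 1_000    # ~1 PWh projected for 2026

print(projection_2026_twh / consumption_2022_twh)  # ~2.17 -> "more than doubles"
# Japan's annual electricity consumption is also roughly 1,000 TWh,
# which is where the report's comparison comes from.
```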
It does fuck all for me except make art and customer service worse on average. But yes, it certainly will result in countless avoidable deaths if we don’t heavily curb its usage soon, as it is projected to quintuple its power draw by 2029.
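For a sense of how steep that is: quintupling over five years implies roughly 38% compound growth every single year (I’m assuming a 2024 baseline here, since the projection doesn’t give one):

```python
# Implied compound annual growth for a 5x increase by 2029.
# Assumption: baseline year is 2024 (not stated in the projection).
years = 2029 - 2024
annual_multiplier = 5 ** (1 / years)  # 5x spread over `years` years
print(f"{(annual_multiplier - 1) * 100:.0f}% per year")  # ~38% per year
```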