I always thought data centers ran separate "clean" and "dirty" cooling loops (as far as the computers are concerned).
The clean loop has all the additives and corrosion inhibitors to keep the cold plates and tubing "safe". The dirty side is just plain old water. A big heat exchanger transfers the heat from the clean (hot) loop to the "dirty" (cold) side.
Is there really that much pollution in that? Can’t be worse than rain going through storm drains or whatever.
But AI does use a phenomenal amount of power.
And, IMO, that's a problem given how little value people are actually getting from AI.
The new Blackwell B200 draws up to 1.2 kW, and essentially all of that comes back out as heat.
A cooling system with a COP of 5 moves 5 W of heat per watt of electrical input, so it needs about 240 W (1.2 kW / 5) to dissipate this.
An 8-GPU B200 system (DGX B200 class) fits in 10U and, with CPU, networking, and other overheads, peaks around 14.3 kW (cooling would be roughly 3 kW of consumption at that COP).
So a 42U data center rack with 3 of these systems, plus supporting hardware and UPS losses (80% efficiency), is going to pull about 54 kW from the grid (plus ~9 kW for cooling). Roughly 62 kW total, which is like 4 homes drawing their full load all the time.
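The rack math above can be sketched out explicitly. This is a back-of-the-envelope calculation using the numbers from the comment; the 80% UPS/distribution efficiency and the COP of 5 are assumptions, not measured figures.

```python
# Back-of-the-envelope rack power math (numbers from the comment above;
# UPS efficiency and cooling COP are assumed, not measured).
GPU_POWER_KW = 1.2        # Blackwell B200 peak draw (and heat output)
GPUS_PER_SYSTEM = 8       # one 10U 8-GPU system
SYSTEM_PEAK_KW = 14.3     # per-system peak incl. CPUs, NICs, fans
SYSTEMS_PER_RACK = 3      # 3 x 10U systems in a 42U rack
UPS_EFFICIENCY = 0.80     # assumed UPS / power-distribution efficiency
COOLING_COP = 5.0         # heat moved per watt of cooling input

it_load_kw = SYSTEM_PEAK_KW * SYSTEMS_PER_RACK  # 42.9 kW at the servers
wall_load_kw = it_load_kw / UPS_EFFICIENCY      # ~53.6 kW from the grid
cooling_kw = it_load_kw / COOLING_COP           # ~8.6 kW to pump heat out
total_kw = wall_load_kw + cooling_kw            # ~62 kW per rack

print(f"IT load:   {it_load_kw:.1f} kW")
print(f"From grid: {wall_load_kw:.1f} kW")
print(f"Cooling:   {cooling_kw:.1f} kW")
print(f"Total:     {total_kw:.1f} kW")
```

Note how the UPS losses dominate the cooling overhead here: at 80% efficiency the power path wastes more energy than the chillers consume.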
I hope they finally find a real application for AI, instead of just constantly chasing the dragon with more training, more parameters, more performance, etc.