
Internet could lower its cooling bills by using hot water

Two of the microchannel hot water heat sinks, on a server blade from IBM/ETH's Aquasar supercomputer
[Image gallery: the IBM/ETH Aquasar supercomputer concept; IBM Zurich researchers in front of the Aquasar; two of the microchannel hot water heat sinks on a server blade; a diagram of how Aquasar will work within the ETH campus]

It’s easy to think of the Internet as something that’s just “out there” in cyberspace, something that doesn’t affect the physical world in any tangible way. In 2009, however, it was estimated that Internet data centers worldwide consumed about 2% of global electricity production. Not only did most of that electricity undoubtedly come from non-green sources, but it also cost the global economy approximately US$30 billion. Much of that electricity was needed to power the data centers’ forced-air cooling systems, which keep the servers from overheating. Now, researchers from IBM Zurich and the Swiss Federal Institute of Technology Zurich (ETH) have devised a much more efficient method for cooling the steamy Internet - they use hot water.

Why water?

Liquid cooling is by nature a much more effective cooling method: by volume, the heat capacity of water is on the order of 4,000 times that of air. Also, once the heat has been transferred to the water, it can be handled more efficiently. In IBM/ETH’s model, the server-heated water could even go on to provide heat for the local community.
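To put rough numbers on that, here is a minimal back-of-the-envelope sketch in Python, built on the steady-state energy balance Q = m·c·ΔT. The 10 kW rack load and 5 K coolant temperature rise are illustrative assumptions, not Aquasar’s actual specifications:

```python
# Back-of-the-envelope coolant sizing from the energy balance Q = m_dot * c_p * dT.
# The heat load and temperature rise are illustrative assumptions, not Aquasar specs.

C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)
C_P_AIR = 1005.0    # specific heat of air, J/(kg*K)
RHO_WATER = 1000.0  # density of water, kg/m^3
RHO_AIR = 1.2       # density of air at room conditions, kg/m^3

def coolant_mass_flow(heat_load_w, delta_t_k, c_p):
    """Mass flow (kg/s) needed to absorb heat_load_w with a delta_t_k temperature rise."""
    return heat_load_w / (c_p * delta_t_k)

load = 10_000.0  # assumed 10 kW rack
rise = 5.0       # assumed 5 K coolant temperature rise

water = coolant_mass_flow(load, rise, C_P_WATER)
air = coolant_mass_flow(load, rise, C_P_AIR)
print(f"water: {water:.2f} kg/s ({water / RHO_WATER * 1000:.2f} L/s)")
print(f"air:   {air:.1f} kg/s ({air / RHO_AIR:.1f} m^3/s)")
# Volumetric heat capacity ratio (~3,500x) - same order as the figure quoted above:
print(f"ratio: {(RHO_WATER * C_P_WATER) / (RHO_AIR * C_P_AIR):.0f}x")
```

Under these assumptions, moving 10 kW takes about half a liter of water per second, versus well over a cubic meter of air per second - which is the practical payoff of water’s far higher volumetric heat capacity.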

But why HOT water?

Chilled water has long been used to cool mainframes, and it certainly does the job, but there’s a catch - chilling that water requires a lot of electricity. The Swiss process instead uses water at 60-70C (140-158F), which is still cool enough to keep the servers’ chips below their “red line” of 85C (185F).
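To see why such warm water suffices, note that the cooler only has to bridge the gap between the water temperature and the chips’ limit. A quick sketch - only the 85C red line and the 60-70C range come from the article; the 100 W chip power is an assumed figure:

```python
# Temperature headroom check: chip must stay below ~85 C even with 70 C inlet water.
# The chip power is an assumption for illustration, not an Aquasar specification.

T_MAX = 85.0     # chip "red line", C (from the article)
T_INLET = 70.0   # worst-case water inlet temperature, C (from the article)
Q_CHIP = 100.0   # assumed chip power dissipation, W

headroom_k = T_MAX - T_INLET   # 15 K of usable temperature budget
r_max = headroom_k / Q_CHIP    # maximum junction-to-coolant thermal resistance, K/W
print(f"headroom: {headroom_k:.0f} K -> cooler must achieve <= {r_max:.2f} K/W")
```

As long as the heat sink conducts well enough to stay within that budget, the fact that the water is already hot simply doesn’t matter.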

How it works

Computers and many other electrical devices dissipate heat using something called a heat sink. Heat sinks typically look like a row of closely spaced, upright rectangular metal blades, and they work by dramatically increasing the device’s surface area - much as an elephant uses its giant ears to increase its own cooling surface area. IBM/ETH’s process uses what they call a microfluidic heat sink, which contains a network of tiny channels through which the water is pumped, absorbing heat from the metal along the way.
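The same surface-area logic explains the microchannels. The toy geometry below (all dimensions invented for illustration - these are not IBM’s design values) shows how many tiny parallel channels multiply the wetted area available for heat transfer compared to the chip’s flat footprint:

```python
# Toy microchannel geometry: total wetted channel area vs. the chip's flat footprint.
# All dimensions are invented for illustration; they are not IBM's design values.

N_CHANNELS = 200       # parallel channels across a 2 cm wide chip (100 um pitch)
CH_WIDTH = 50e-6       # channel width, m
CH_HEIGHT = 500e-6     # channel depth, m
CH_LENGTH = 0.02       # channel length, m
CHIP_FOOTPRINT = 0.02 * 0.02  # 2 cm x 2 cm chip, m^2

# Wetted area of one rectangular channel = perimeter * length
per_channel = 2 * (CH_WIDTH + CH_HEIGHT) * CH_LENGTH
total_wetted = N_CHANNELS * per_channel

print(f"flat footprint:      {CHIP_FOOTPRINT * 1e4:.0f} cm^2")
print(f"wetted channel area: {total_wetted * 1e4:.0f} cm^2 "
      f"({total_wetted / CHIP_FOOTPRINT:.0f}x the bare surface)")
```

Even this crude geometry gives roughly an order of magnitude more heat-exchange surface than the bare chip, which is what lets a modest water flow pull so much heat out of the silicon.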

[Image: The IBM/ETH Aquasar supercomputer concept]

A working model

To demonstrate their technology, IBM and ETH are building a supercomputer called Aquasar, which should be completed this year. Aquasar will be housed on the ETH campus, and will provide heat to its buildings. It will operate as a closed system: the same water will cool the servers, release its heat into the buildings, then return to the computer to cool it again. It is anticipated that the new system will cut the campus’ computer-cooling carbon footprint by over 85%, saving up to 30 tons of CO2 per year.
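A simple energy balance illustrates how such a closed loop settles into a steady state. Only the 60-70C operating range comes from the article; the flow rate and heat load below are assumed for illustration:

```python
# Toy closed-loop balance: water gains heat at the servers, sheds it into the
# campus heating circuit, and returns at its starting temperature.
# Flow rate and heat load are illustrative assumptions, not Aquasar figures.

C_P = 4186.0        # specific heat of water, J/(kg*K)
INLET_C = 60.0      # server inlet temperature, C (low end of the article's range)
SERVER_KW = 10.0    # assumed server heat load
FLOW_KG_S = 0.5     # assumed loop flow rate

outlet_c = INLET_C + SERVER_KW * 1000.0 / (FLOW_KG_S * C_P)
print(f"server outlet: {outlet_c:.1f} C")
# In steady state the building heat exchanger must reject the same 10 kW,
# cooling the water back to 60 C; otherwise the loop temperature drifts upward.
```

With these numbers the water leaves the servers only about 5 K warmer than it entered, comfortably inside the 60-70C band, and the buildings absorb exactly the heat the servers add.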

[Video: Aquasar Supercomputer]

11 comments
Skipjack
What about the times of the year when you do NOT want to heat your rooms and buildings, when you actually need air conditioning? Unless the campus is somewhere high up in the Swiss Alps...
Chris Maresca
This is pretty much what water cooling geeks have been doing with desktops for ages. Circulate water and cool it through a radiator. Companies like Koolance http://www.koolance.com and FrozenCPU http://www.frozencpu.com have been offering kits for ages and there are whole forums dedicated to this (http://hardforum.com/forumdisplay.php?f=91 or http://www.overclockers.com/forums/forumdisplay.php?f=71)
christopher
Implausible?? 2% of *global* electricity? Only the datacenters?? The USA alone made around 4000 billion kWh in 2006. Datacenters charge by the amp - usually 1 amp for a regular server, 2 amps for a grunty megaserver - that's between 110 and 220 W, or somewhere between 1000 and 2000 kWh per year.
Power going into computers turns almost directly into heat; air conditioners are "heat pumps" (e.g. heat is moved from inside the center to outside, which takes a lot less energy than the amount of heat moved). If we conservatively guesstimate 50% less (and I'm probably overestimating by an order of magnitude) then...
For that 2% number to be true, there would have had to be more than 40 million servers in datacenters in the USA alone in 2006 - that's more than 1 server for every 7 people in the USA.
I'm sure there are a lot of servers in the USA, but 40 million? *All* in datacenters??
I doubt it.
jmdelrio1
As much as I love this idea, and I would like to see water cooling on all machines, how do they expect to handle the volume of hot water needed by nearby buildings or a community? Those are tiny, tiny tubes coming out of that server. :)
gormanwvzb
One place to start is using green roofs in combination with PV cells on data center roofs. (http://cleanerairforcities.blogspot.com/2008/07/data-centers-need-green-roofs.html) Not only would the HVAC bill be reduced by 20-30%, but the PV cells would work more efficiently with a cooler roof. (http://cleanerairforcities.blogspot.com/2009/06/combining-solar-and-green-roofs.html)
domhnall
@Skipjack: Presumably the same water used for the air conditioning would still be cool enough to cool the CPUs.
Eric Malatji
What about the build-up of impurities? They would need something to remove them before the water gets into the computers, as building such a system and having to clean it would be a huge task.
donwine
The whole point is to get rid of heat without using energy. If you use water that has been chilled, it takes energy to do it. They merely found a larger radiator - homes that use heat. The real challenge is to convert heat to cold without using energy. Think of the market - no AC, cooling cars and industry. The planet Earth is the best example: it absorbs heat, which rises, then cools to form rain, which in turn cools the planet. No other celestial object does this. Our tools are electricity, magnetism and water, and yet we cannot copy the design of our planet.
cloa513
To christopher: they did say 2% globally - that includes China, Europe, Japan, North America...
China's usage on data centres would be a lot more, given they have a limited middle class, want to develop a lot more rapidly, and don't care about the cost or whether their centres are built to be energy efficient.
Brian Hall
The CO2 stuff is just trendy distraction. It's the old principle: a heat pump moves energy from unwanted to wanted areas efficiently. That's all that needed to be said.
CO2 is beneficial, but all mankind's output is trivial noise in the ocean-controlled flux in the atmosphere.