Internet could lower its cooling bills by using hot water


April 16, 2010

Two of the microchannel hot water heat sinks, on a server blade from IBM/ETH's Aquasar supercomputer


It’s easy to think of the Internet as something that’s just “out there” in cyberspace, something that doesn’t affect the physical world in any tangible way. In 2009, however, it was estimated that Internet data centers worldwide consumed about 2% of global electricity production. Not only did most of that electricity undoubtedly come from non-green sources, but it also cost the global economy approximately 30 billion US dollars. Much of that electricity was needed to power the data centers’ forced-air cooling systems, which keep the servers from overheating. Now, researchers from IBM Zurich and the Swiss Federal Institute of Technology Zurich (ETH) have devised a much more efficient method for cooling the steamy Internet - they use hot water.

Why water?

Liquid cooling is by nature a much more effective cooling method, as the heat capacity of water, per unit volume, is roughly 4,000 times that of air. Also, once the heat has been transferred to the water, it can be handled more efficiently. In IBM/ETH’s model, the server-heated water could even go on to provide heat for the local community.
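That volumetric figure can be sanity-checked with standard textbook values for density and specific heat - a rough sketch only, since the exact ratio depends on temperature and pressure:

```python
# How much heat does 1 cubic metre of coolant absorb per kelvin of
# temperature rise? Textbook values at roughly room conditions.
WATER_DENSITY = 998.0   # kg/m^3
WATER_CP = 4186.0       # J/(kg*K), specific heat of water
AIR_DENSITY = 1.2       # kg/m^3
AIR_CP = 1005.0         # J/(kg*K), specific heat of air

water_vol_cap = WATER_DENSITY * WATER_CP  # J/(m^3*K)
air_vol_cap = AIR_DENSITY * AIR_CP        # J/(m^3*K)

ratio = water_vol_cap / air_vol_cap
print(f"Water absorbs ~{ratio:.0f}x more heat per unit volume than air")
```

With these values the ratio comes out around 3,500; published comparisons often round it to "thousands of times," which is the point that matters for cooling.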

But why HOT water?

Chilled water has long been used to cool mainframes, and it certainly does the job, but there’s a catch - chilling that water requires a lot of electricity. The Swiss process instead uses water at 60-70C (140-158F), which is still cool enough to keep the servers’ chips below their “red line” of 85C (185F).
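A quick sketch shows why even warm water works. Assuming a hypothetical 300 W server blade (not a published Aquasar spec) and letting the coolant warm by 10 K as it crosses the blade - say 60 C in, 70 C out, both safely under the 85 C chip limit - only a trickle of flow is needed:

```python
# Mass flow of water needed to carry away a given heat load:
# Q = m_dot * c_p * delta_T  =>  m_dot = Q / (c_p * delta_T)
BLADE_POWER_W = 300.0  # assumed heat load of one blade (illustrative)
CP_WATER = 4186.0      # J/(kg*K)
DELTA_T = 10.0         # K, inlet-to-outlet temperature rise

mass_flow = BLADE_POWER_W / (CP_WATER * DELTA_T)  # kg/s
print(f"Required flow: {mass_flow * 1000:.1f} g/s "
      f"(~{mass_flow * 60:.2f} L/min of water)")
```

About 7 g/s, or under half a litre per minute per blade - what matters for the chips is not the coolant's absolute temperature but that it stays below the red line and can soak up the heat flux.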

How it works

Computers and many other electrical devices dissipate heat using something called a heat sink. A typical heat sink looks like a row of closely spaced, upright rectangular metal blades, and it works by dramatically increasing the device’s surface area - much as an elephant uses its giant ears to increase its own cooling surface area. IBM/ETH’s process uses what they call a microfluidic heat sink, containing a network of tiny channels through which the water is pumped, absorbing heat from the metal along the way.
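The surface-area gain from those tiny channels can be illustrated with made-up but plausible dimensions (these are not Aquasar’s actual specifications):

```python
# Compare the wetted area of a microchannel heat sink against the bare
# footprint of the chip it covers. Dimensions are illustrative only.
CHIP_SIDE = 0.020    # m, a 20 mm x 20 mm chip
N_CHANNELS = 100     # parallel channels across the chip
CH_WIDTH = 100e-6    # m, channel width
CH_DEPTH = 300e-6    # m, channel depth
CH_LENGTH = CHIP_SIDE  # channels run the full length of the chip

flat_area = CHIP_SIDE ** 2  # bare chip footprint, m^2

# Full rectangular perimeter (floor, ceiling, two walls) times length:
wetted_area = N_CHANNELS * 2 * (CH_WIDTH + CH_DEPTH) * CH_LENGTH

ratio = wetted_area / flat_area
print(f"Microchannels expose {ratio:.1f}x the flat chip area to the water")
```

Even this toy geometry quadruples the contact area; finer, deeper channels push the gain further, which is why microchannel sinks transfer heat so much better than a flat cold plate.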

A working model

To demonstrate their technology, IBM and ETH are creating a supercomputer called Aquasar, which should be completed this year. Aquasar will be housed on the ETH campus, and will provide heat to its buildings. It will operate as a closed system, so the same water will cool the servers, release their heat into the buildings, then return to the computer to cool it again. It is anticipated that the new system will cut the campus’ computer-cooling carbon footprint by over 85%, and save up to 30 tons of CO2 per year.
About the Author
Ben Coxworth An experienced freelance writer, videographer and television producer, Ben's interest in all forms of innovation is particularly fanatical when it comes to human-powered transportation, film-making gear, environmentally-friendly technologies and anything that's designed to go underwater. He lives in Edmonton, Alberta, where he spends a lot of time going over the handlebars of his mountain bike, hanging out in off-leash parks, and wishing the Pacific Ocean wasn't so far away.

What about the times of year when you do NOT want to heat your rooms and buildings, when you actually need air conditioning? Unless the campus is somewhere high up in the Swiss Alps...


This is pretty much what water-cooling geeks have been doing with desktops for ages: circulate water and cool it through a radiator. Companies like Koolance and FrozenCPU have been offering kits for ages, and there are whole forums dedicated to this.

Chris Maresca

Implausible?? 2% of global electricity? Only the datacenters?? The USA alone made around 4,000 billion kWh in 2006. Datacenters charge by the amp - usually 1 amp for a regular server, 2 amps for a grunty megaserver - that's between 110 and 220 W, or somewhere between 1,000 and 2,000 kWh per year.

Power going into computers turns almost directly into heat, and air conditioners are "heat pumps" (i.e. heat is moved from inside the center to outside, which takes a lot less energy than the amount of heat moved). If we conservatively guesstimate 50% extra for cooling (and I'm probably overestimating by an order of magnitude) then...

For that 2% number to be true, there would have had to be more than 40 million servers in datacenters in the USA alone in 2006 - that's more than 1 server for every 7 people in the USA.

I'm sure there's a lot of servers in the USA, but 40 million? All in datacenters??

I doubt.
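For what it's worth, the commenter's arithmetic does come out as stated - using their own numbers (though note the article's 2% figure was global, not US-only, as another reader points out below):

```python
# Reproducing the comment's back-of-envelope check of the 2% claim.
US_GENERATION_KWH = 4000e9     # ~4,000 billion kWh, US generation in 2006
SHARE = 0.02                   # the disputed 2% figure
KWH_PER_SERVER = 2000.0        # high end: a 2 A / 220 W "megaserver"

datacenter_kwh = US_GENERATION_KWH * SHARE      # implied datacenter usage
servers_needed = datacenter_kwh / KWH_PER_SERVER

print(f"Implied server count: {servers_needed / 1e6:.0f} million")
```

80 billion kWh divided by 2,000 kWh per server does give 40 million servers, so the arithmetic holds; the disagreement is over what the 2% was actually measured against.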


As much as I love this idea, and I would like to see water cooling on all machines, how do they expect to handle the volume of hot water needed by nearby buildings or a whole community? Those are tiny, tiny tubes coming out of that server. :)


One place to start is using green roofs in combination with PV cells on data center roofs. Not only would the HVAC bill be reduced by 20-30%, but the PV cells would work more efficiently with a cooler roof.


@Skipjack: Presumably the same water used for air conditioning would still be cool enough to cool the CPUs.


What about the build-up of impurities? They would need something to remove them before the water gets into the computers, as building such a system and having to clean it would be a huge task.

Eric Malatji

The whole point is to get rid of heat without using energy. If you use water that has been chilled, it takes energy to do it. They merely found a larger radiator - homes that use the heat. The real challenge is to convert heat to cold without using energy. Think of the market - no AC, cooling for cars and industry. The planet Earth is the best example: it absorbs heat, which rises, then cools to form rain, and in return cools the planet. No other celestial object does this. Our tools are electricity, magnetism and water, and yet we cannot copy the design of our planet.


To christopher: They did say global 2%, and that includes China, Europe, Japan, North America... China's usage on data centres would be a lot more, given they have a limited middle class, want to develop a lot more rapidly, and don't care about the cost or whether the centres are built to be energy efficient.


The CO2 stuff is just trendy distraction. It's the old principle: a heat pump moves energy from unwanted to wanted areas efficiently. That's all that needed to be said.

CO2 is beneficial, but all mankind's output is trivial noise in the ocean-controlled flux in the atmosphere.

Brian Hall

The whole point of this idea is saving money by saving electricity. All the power it needs is for the pumps. The trick they used here is creating a very large radiator to dissipate heat, achieving a high efficiency - the efficiency being the amount of power in versus the heat that's going out. The same principle is increasingly used in homes, where the hot water can be at a lower temperature than before for the same heating demand, thanks to a large radiator, resulting in energy savings.

A heat pump in comparison needs more power for its compressor. That and other cooling methods are not going to achieve similar cost savings.

Fretting Freddy the Ferret pressing the Fret