Europe's fastest supercomputer uses warm water cooling to conserve energy and heat buildings

View gallery - 16 images
Gallery: The LRZ "SuperMUC" system at the Leibniz Supercomputing Center (LRZ), the computer center for Munich's universities and for the Bavarian Academy of Sciences and Humanities, is the world's first commercially available hot-water cooled supercomputer. Built with IBM's System x iDataPlex Direct Water Cooled dx360 M4 servers, it includes more than 150,000 cores for a peak performance of up to three petaflops - the equivalent of more than 110,000 personal computers, or three billion people each performing one million pocket-calculator operations per second. Further images show the rooftop cooling towers, the air-cooled machine room of the previous SGI Altix 4700, and a rendition of SuperMUC rendered by SuperMUC itself. (Images: IBM Research - Zurich; Leibniz-Rechenzentrum der Bayerischen Akademie der Wissenschaften)

An innovative cooling design for SuperMUC, Europe's most powerful supercomputer, will use warm water instead of air to keep tens of thousands of microprocessors at their optimal operating speed and increase peak performance. The system, which is said to cool components 4,000 times more efficiently than air, will also warm the Leibniz Supercomputing Centre campus that hosts it during the winter months, generating expected savings of up to US$1.25 million per year.

Cooling a data center is an expensive task, as it can account for up to 50 percent of the center's total energy consumption. The bill is even steeper for SuperMUC, which is hosted in Germany, a country that, starting this year, requires all the electricity consumed by state-funded institutions to come entirely from sustainable energy sources.

The innovative cooling technology developed by IBM will help address this problem. Using a design inspired by the human circulatory system, it transports water as warm as 45 degrees Celsius (113° F) directly to the processors and memory components. The result is a system that is 10 times more compact than a comparable air-cooled setup and consumes 40 percent less energy.
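
A rough sense of why water, even warm water, beats air as a coolant comes from comparing volumetric heat capacities. The sketch below uses textbook values for density and specific heat (illustrative approximations, not figures from IBM or LRZ), and lands in the same ballpark as the 4,000-times figure quoted above:

    # Approximate volumetric heat capacity of water vs. air at room conditions.
    # Values are textbook approximations, not measurements from SuperMUC.
    water_density = 998.0          # kg/m^3
    water_specific_heat = 4186.0   # J/(kg*K)
    air_density = 1.2              # kg/m^3
    air_specific_heat = 1005.0     # J/(kg*K)

    water_per_m3 = water_density * water_specific_heat  # ~4.2 MJ per m^3 per K
    air_per_m3 = air_density * air_specific_heat        # ~1.2 kJ per m^3 per K

    print(f"Water carries roughly {water_per_m3 / air_per_m3:,.0f}x more heat "
          f"per unit volume per degree than air")        # on the order of 3,500x

That gap is what allows 45-degree water to hold the chips at a safe temperature while still leaving the return water hot enough to be worth piping into the campus heating system.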

According to the recently revised Top500 list, SuperMUC's 18,000 energy-efficient Intel Xeon processors make it Europe's fastest supercomputer, clocking in at 3 petaflops (3 million billion floating point operations per second). That's a long way from the new number one on the list - the IBM Sequoia, which is seven times as fast - but with a performance comparable to that of more than 110,000 personal computers put together, SuperMUC isn't exactly slow, either.
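
The comparisons above are easy to verify with back-of-the-envelope arithmetic; the per-PC throughput that falls out is an implied, illustrative figure rather than anything IBM specifies:

    # Sanity-checking the performance figures quoted for SuperMUC.
    peak_flops = 3e15              # 3 petaflops = 3 x 10^15 operations per second

    # 10^15 is a million billions, hence "3 million billion" operations per second.
    assert peak_flops == 3 * 1e6 * 1e9

    # Spreading the peak over ~110,000 PCs implies roughly 27 GFLOPS per machine,
    # a plausible figure for a desktop CPU of the era.
    pcs = 110_000
    print(f"Implied per-PC throughput: ~{peak_flops / pcs / 1e9:.0f} GFLOPS")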

This impressive number-crunching capacity will be used to aid a number of research projects across Europe, ranging from simulating the blood flow generated by an artificial heart valve to improving our understanding of earthquakes. The SuperMUC system is also connected to powerful visualization systems, including a large stereoscopic power wall and a five-sided immersive artificial virtual-reality environment for visualizing three-dimensional data sets.

The engineering team behind the project is targeting an aggressive reduction in size, saying it can shrink the system's volume tenfold every five years so that, 30 years from now, the entire processing power of the data center would fit in a form factor the size of a standard desktop computer, at a much higher energy efficiency than it has today.
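
Compounded over the stated horizon, that roadmap amounts to a millionfold shrink; a quick calculation makes the scale of the claim concrete:

    # Compounding the stated target: a tenfold volume reduction every five years.
    years = 30
    step_years = 5
    steps = years // step_years                  # six five-year generations
    shrink_factor = 10 ** steps
    print(f"Volume reduction after {years} years: {shrink_factor:,}x")   # 1,000,000x

A millionfold reduction is roughly the jump from a hall-sized installation down to a desktop case, which is the comparison the team is drawing.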

The project is jointly funded by the German federal government and the state of Bavaria. SuperMUC will be officially inaugurated in July 2012 at the Leibniz Supercomputing Centre in Garching, Germany. The video below explains more about the cooling mechanism.

Source: IBM

9 comments
windykites
Have they considered the use of a fluid other than water? My first thought was, I hope it doesn't spring a leak! Water is the cheapest liquid, obviously, but maybe a refrigerant could work more efficiently, and would not short-circuit in the event of a leak. I am intrigued by the need to use warm water. I thought that circuits work better when they're cold. Interesting that the heat exchange is totally localised. Is it really the best way to remove excess heat?
Snake Oil Baron
Do you know what is an even better coolant than warm water? Non-warm water.
Slowburn
re; windykites1
Deionized water does not conduct electricity, evaporates without leaving a residue, and carries a large amount of heat per volume.
The chips are cooled in parallel so they all get equal cooling, and the central heat exchanger used to dump the heat into the building's environmental system gives an even water temperature to the chips and works well with the air handlers.
Snake Oil Baron
I followed the link to a similar Gizmag story and the idea seems to be that warm water saves money because chilling it is expensive. But water usually does not come out of the ground or reservoirs at 60 degrees C, so why warm it? If they mean that the water is going to warm up that much in the machines after a few passes, so it will be continuously warm, it would still be a simple thing to run it through some coiled pipes to cool it before recycling it. I just don't get the reason for using *warm* water.
Dave Hill
Rising energy costs, along with an increased focus on reducing the environmental impact of large-scale datacenter deployments, are allowing for some pretty interesting innovation on this front.
NextDC's M1 facility in Melbourne is showing similar ingenuity by having triple power sources (Utility A/C, Natural Gas Gen, Diesel Gen) configured in an N+1 scenario. This will ensure costs are controllable despite carbon taxes, etc.
http://nextdc.com/blog/132-m1-carbon-tax-and-our-tri-gen-plants-for-melbourne.html
Necessity is the mother of invention and there's no necessity like keeping an eye on your hip pocket.
FredExII
The idea here is economy. It does not pay to spend millions on chilling when the performance gain for a setup like this would not be what it is for a home system. This type of system needs far less cooling to maintain the "circulatory system" at the needed temperature. They gain more savings in the cold weather, when they can use the waste heat to heat other buildings. It is a balance. Sure, they could chill the system, as I said, for millions, for a minimal performance gain and use the waste heat for heating other buildings, but the cost of chilling comes nowhere near being offset by the money saved heating those buildings.
Slowburn
It has been my experience that refrigerating the computer case does not make the chips faster; it just creates a larger temperature differential that speeds the heat transfer out of the chip, allowing the chips to be overclocked without overheating. Water, being a much better coolant than air, does not need as large a temperature differential to provide the same level of heat removal, so warm water can provide the same amount of cooling as chilled air.
MQ
Maybe they should have made use of the term "optimal"... Even cooling the chips to 100 degrees C, as long as they are maintained at a constant temperature, all of the excess energy is removed... therefore there is no risk of overheating (depending on the max design temperature).
(IBM states "the world's first commercially available hot-water cooled supercomputer", so the qualifiers are "hot water" and "supercomputer"; there are water-cooled server farms already.)
Not a lot new... (IBM have been using water cooling for several years... customers just need to catch on.) Quote from 2007: "Water cooling is both more efficient than air cooling and can handle higher heat loads, simply because water is far more conductive of heat and has much higher thermal mass than air. It's been slow to catch on because administrators are paranoid about leaks (water and electronics certainly don't mix well), but systems are available now that have been proven reliable. IBM and HP have water-cooled server racks, and Knurr's even won a design award." http://www.ecogeek.org/content/view/1140/71/
They didn't say that the water is heated using external energy... a few passes through the system and the water will be hot. Thermostatically controlling the hot side (as in car radiators, only using liquid-to-liquid exchangers with a large thermal pool for sinking the heat). Using the waste heat to heat buildings is another logical step (I hope no new patents have been granted for that... even generating electricity from low-grade heat sources can be improved upon...).
People who know nothing about thermodynamics will just go "Wow"...
It's all about energy removal... Even the turbulence of the coolant in the pipes affects the transfer rate... Pretty much the energy transport will usually be the "slowest part of the system", not the conductance through the walls of the heat-sinking channels... Higher turbulence increases the convection to remove the heat... (low turbulence gives a slower transfer rate, but longer pipe/channel life).
The probable take-home is that hot water controls the temperature of the chips better than expensive cold air...
Brian H
Notice the mention of massive further increases in efficiency as they shrink the whole design. As the channels (capillaries?) get smaller, the surface area increases, speeding heat transfer rates.