Research Papers

Data Center Housing High Performance Supercomputer Cluster: Above Floor Thermal Measurements Compared To CFD Analysis

Author and Article Information
Roger Schmidt, Madhusudan Iyengar

 IBM Systems and Technology Group, 2455 South Road, Poughkeepsie, NY 12601

Joe Caricari

 Grubb and Ellis, 2455 South Road, Poughkeepsie, NY 12601

J. Electron. Packag. 132(2), 021009 (Jun 25, 2010) (8 pages); doi:10.1115/1.4001871
History: Received August 13, 2008; Revised May 20, 2010; Published June 25, 2010; Online June 25, 2010

With the ever increasing heat dissipated by information technology (IT) equipment housed in data centers, it is becoming more important to project the changes that occur in a data center as newer, higher-powered hardware is installed. General-purpose computational fluid dynamics (CFD) software has improved over the years, and CFD software specific to data center thermal analysis has also been developed, making quick analyses of the effects of installing new hardware in the data center more timely. It is critically important, however, that this software report those effects to the user accurately. The purpose of this paper is to examine a large cluster installation and compare a CFD analysis with environmental measurements obtained at the same site. The paper presents measurements and CFD results for racks dissipating as much as 27 kW, clustered such that heat fluxes in some regions of the data center exceeded 700 W per square foot. It describes the thermal profile of a high performance computing cluster located in a data center, together with a CFD model of that cluster. The high performance advanced simulation and computing (ASC) cluster had a peak performance of 77.8 TFlop/s and employed more than 12,000 processors, 50 Tbytes of memory, and 2 Pbytes of globally accessible disk space. The cluster was first tested in the manufacturer's development laboratory in Poughkeepsie, New York, and then shipped to Lawrence Livermore National Laboratory in Livermore, California, where it was installed to support the national security mission of the U.S. Detailed measurements were taken in both data centers and were previously reported. The Poughkeepsie results are reported here along with a comparison to CFD modeling results. Some regions of the Poughkeepsie data center exceeded the equipment inlet air temperature specifications by a significant amount; these areas are highlighted, and reasons are given for why they failed to meet the criteria. By region, the modeling results showed trends that compared somewhat favorably with the measurements, but some rack thermal profiles deviated significantly.
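The relationship examined in Fig. 8, between a region's air flow rate to IT power ratio and its average server inlet temperature, is rooted in a steady-state energy balance on the air moving through a rack. The sketch below illustrates that balance only; the air properties, flow rates, and rack power used are illustrative assumptions, not values measured in the study.

```python
# A minimal sketch (not from the paper): the steady-state energy balance
# linking a rack's air flow rate to IT power ratio with its air-side
# temperature rise. All numeric values are illustrative assumptions.

RHO_AIR = 1.16         # kg/m^3, air density at roughly 25 C (assumed)
CP_AIR = 1005.0        # J/(kg*K), specific heat of air (assumed)
CFM_TO_M3S = 4.719e-4  # 1 CFM expressed in m^3/s

def rack_air_temperature_rise(it_power_w: float, airflow_cfm: float) -> float:
    """Inlet-to-exhaust air temperature rise (in C) for one rack, from
    dT = Q / (rho * Vdot * cp)."""
    vdot_m3s = airflow_cfm * CFM_TO_M3S   # volumetric flow, m^3/s
    mdot_kgs = RHO_AIR * vdot_m3s         # mass flow, kg/s
    return it_power_w / (mdot_kgs * CP_AIR)

if __name__ == "__main__":
    # A hypothetical 27 kW rack at several airflow-to-power ratios.
    for cfm in (2500.0, 3500.0, 4500.0):
        dt = rack_air_temperature_rise(27_000.0, cfm)
        print(f"{cfm:6.0f} CFM ({cfm / 27.0:5.1f} CFM/kW) -> dT = {dt:4.1f} C")
```

At fixed IT power, more delivered airflow means a smaller air-side temperature rise, which is consistent with the downward trend of average regional server inlet temperature against the airflow-to-power ratio examined in Fig. 8.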

Copyright © 2010 by American Society of Mechanical Engineers

Figures

Figure 1: Datacom equipment power density chart, used with permission from ASHRAE (3)

Figure 2: Poughkeepsie data center housing the supercomputer cluster

Figure 3: Poughkeepsie data center, region A

Figure 4: Poughkeepsie data center, region B

Figure 5: CFD model of the above-floor regions

Figure 6: Comparisons of the CFD to the measurements, regions 1–4

Figure 7: Comparisons of the CFD to the measurements, regions 5–7

Figure 8: Influence of the air flow rate to IT power ratio on the average regional server inlet temperature
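As a back-of-envelope check (an illustrative calculation, not a layout taken from the paper), the 700 W per square foot regional heat flux quoted in the abstract implies an upper bound on the floor area attributable to each 27 kW rack:

```latex
% Illustrative footprint check; the layout assumption is ours, not the paper's.
\[
  A_{\mathrm{rack}} \;=\; \frac{Q_{\mathrm{rack}}}{q''}
  \;=\; \frac{27{,}000\ \mathrm{W}}{700\ \mathrm{W/ft^{2}}}
  \;\approx\; 38.6\ \mathrm{ft^{2}}
\]
```

That is, a region reaches that flux once each 27 kW rack, together with its share of the surrounding aisle space, occupies less than roughly 39 square feet of raised floor.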
