Research Papers

From Chip to Cooling Tower Data Center Modeling: Chip Leakage Power and Its Impact on Cooling Infrastructure Energy Efficiency

Author and Article Information
Thomas J. Breen

e-mail: thomas.breen@ul.ie

Jeff Punch

Stokes Institute,
University of Limerick,
Limerick, Ireland

Niru Kumari

Hewlett-Packard Laboratories,
Palo Alto, CA 94304

Tahir Cader

HP Enterprise Business,
Liberty Lake, WA 99019

Contributed by the Electronic and Photonic Packaging Division of ASME for publication in the Journal of Electronic Packaging. Manuscript received June 5, 2012; final manuscript received August 27, 2012; published online November 26, 2012. Assoc. Editor: Siddharth Bhopte.

J. Electron. Packag. 134(4), 041009 (Nov 26, 2012) (8 pages); doi:10.1115/1.4007744. History: Received June 05, 2012; Revised August 27, 2012

The power consumption of a chip package is known to vary with operating temperature, independently of the workload processing power. This variation, commonly known as chip leakage power, typically accounts for ~10% of total chip power consumption. The influence of operating temperature on leakage power is a major design-optimization concern for the information technology (IT) industry: IT system power densities are steadily increasing, and leakage power is expected to account for up to ~50% of chip power in the near future as package sizes shrink. Much attention has been placed on developing models of chip leakage power as a function of package temperature, ranging from simple linear models to complex super-linear models. This knowledge is crucial for IT system designers to improve chip-level energy efficiency and minimize heat dissipation. However, this work has focused on the component level, with little thought given to the impact of chip leakage power on overall data center efficiency. Studies on data center power consumption quote IT system heat dissipation as a constant value, without accounting for the variation of chip power with operating temperature due to leakage power. Previous modeling techniques have also omitted this temperature-dependent relationship. In this paper, we discuss the need for chip leakage power to be included in the analysis of holistic data center performance. A chip leakage power model is defined and its implementation into an existing multiscale data center energy model is discussed. Parametric studies are conducted over a range of system and environment operating conditions to evaluate the impact of varying degrees of chip leakage power. Possible strategies for mitigating the impact of leakage power are also illustrated in this study. This work illustrates that, when chip leakage power is included in the data center model, a compromise exists between raising operating temperatures to improve cooling infrastructure efficiency and the increase in heat load at higher operating temperatures due to leakage power.
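
To make the form of such a leakage model concrete, the short Python sketch below pairs a linear leakage term, leakage = LX * (THS - THS,Leak0), with a hypothetical cooling power curve whose savings saturate at high temperature. The parameter names LX, THS, and THS,Leak0 follow the figure captions; the numerical values, the cooling curve, and the efficiency ratio used in place of COP* are illustrative assumptions rather than the multiscale data center model developed in the paper.

import math

# Minimal sketch of a linear chip leakage power model, using the parameter
# names from the figures (LX, THS, THS,Leak0). The numbers and the cooling
# efficiency curve are illustrative assumptions only.

def chip_power(processing_w, t_hs_c, lx_w_per_degc=0.5, t_hs_leak0_c=20.0):
    """Total chip power = workload processing power + temperature-dependent leakage."""
    leakage_w = max(0.0, lx_w_per_degc * (t_hs_c - t_hs_leak0_c))
    return processing_w + leakage_w

def cooling_power(heat_load_w, t_hs_c):
    """Hypothetical cooling infrastructure power: the power spent per watt of
    heat removed falls as the heat sink temperature rises, but the savings
    saturate at high temperatures."""
    watts_per_watt = 0.05 + 1.2 * math.exp(-(t_hs_c - 20.0) / 20.0)
    return heat_load_w * watts_per_watt

if __name__ == "__main__":
    # Sweep the heat sink temperature for a single 100 W workload. Raising THS
    # cuts cooling power but adds leakage heat, so the efficiency ratio
    # (useful processing power / total facility power, a stand-in for COP*)
    # peaks at an intermediate temperature, the compromise noted above.
    processing_w = 100.0
    for t_hs in range(30, 81, 10):
        it_w = chip_power(processing_w, t_hs)
        cool_w = cooling_power(it_w, t_hs)
        ratio = processing_w / (it_w + cool_w)
        print(f"THS = {t_hs} degC  IT = {it_w:.1f} W  cooling = {cool_w:.1f} W  ratio = {ratio:.3f}")

In this toy model, larger leakage gradients pull the peak toward lower temperatures; the parametric studies in the paper examine this interplay with a full multiscale model.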

Copyright © 2012 by ASME

Figures

Fig. 1
Schematic of the data center cooling infrastructure

Fig. 2
Breakdown of chip power dissipation into processing power and leakage power, illustrating the leakage gradient

Fig. 3
Data center IT power usage for fixed heat sink temperatures with THS,Leak0 = 20 °C and increasing leakage gradient. (a) THS = 50 °C, LX = 0.0 W/°C. (b) THS = 50 °C, increasing LX. (c) Full range of THS and LX.

Fig. 4
Data center cooling infrastructure power usage for fixed heat sink temperatures (color coded) with THS,Leak0 = 20 °C and increasing leakage gradient

Fig. 5
Total data center facility power usage for fixed heat sink temperatures (color coded) with THS,Leak0 = 20 °C and increasing leakage gradient

Fig. 6
Breakdown of total data center facility power at 20 °C rack inlet temperature. (a) THS = 50 °C, LX = 0.0 W/°C. (b) THS = 50 °C, LX = 1.0 W/°C. (c) THS = 70 °C, LX = 0.0 W/°C. (d) THS = 70 °C, LX = 1.0 W/°C.

Fig. 7
Impact of the range of operating conditions on COP* for fixed heat sink temperatures (color coded) with THS,Leak0 = 20 °C and increasing leakage gradient

Fig. 8
Peak COP* for each LX across the THS range for fixed heat sink temperatures with THS,Leak0 = 20 °C

Fig. 9
COP* values for heat sink temperature varying linearly with inlet air temperature. THS,Leak0 = 20 °C and THS,Ref = 50 °C at 20 °C inlet air

Fig. 10
COP* values for heat sink temperature varying linearly with inlet air temperature. THS,Leak0 = 20 °C and THS,Ref = 50 °C at 20 °C inlet air
