Transforming IT Blog
Join us in the Transforming IT HP Blog where we will discuss reinventing IT to overcome obstacles and take advantage of Instant on Enterprise opportunities.

LEED for Data Centers: Finally a Reality?


Many people have been inquiring about the status of the LEED for data centers standard, which began as a project funded by the California Energy Commission and managed by Lawrence Berkeley National Labs.


Well, the good news is that the U.S. Green Building Council (USGBC) has embarked on a process it calls adaptation of the LEED 3.0 standard. Simply put, under this process, credits are modified for a particular type of building that does not work well under the standard LEED rating systems because of the special nature of its occupancy, equipment, size, etc. Here is a summary of the rules:


Credit Modification

  • Changes are proposed based on the current balloted credit language

  • The overall intent of a credit cannot be altered

  • Only 6 modifications can be made (regardless of the number of rating systems being addressed)

  • 1 of the 6 modifications can be adoption of a “new” credit from another rating system

  • Entirely new concepts/credits to LEED are not permitted

  • No modifications are allowed to the minimum project and minimum eligibility requirements (though we will note issues for future deliberation/working groups)

  • Proposed modifications must be approved by LEED Steering Committee

  • Changing point values and weightings is not part of this effort

So we are working within the boundaries of the LEED standard for new construction, but we are also looking to LEED for Commercial Interiors and LEED for Existing Buildings: Operations and Maintenance to determine their impact and to adopt credits from those standards that make sense for the New Construction standard.


We had our first meeting at the USGBC offices in Washington, DC last month. The team that is working on this effort is as follows:

  • USGBC: four representatives

  • Industry subject matter expert (liaison between working group and USGBC): Bill Kosik – HP

  • Working group: eleven representatives from consulting engineering firms, data center owners, builders, equipment vendors and researchers

The preliminary timeline is to have draft language to the USGBC’s technical and steering committees by late spring 2010. It is not yet clear when a final standard will be made available or when pilot projects will be requested.




Using outside air in data centers (AKA "free cooling")

Here are some more analytics and data visualizations, this time on using outside air to cool a data center. When using this type of strategy, the hourly outside temperature and humidity conditions drive the overall strategy and control of the HVAC systems. This is why it is critical to develop a very granular analysis of the climate at the particular data center site.


This is the base case (no economizer).



This is the case using an economizer.


Now we need to analyze the hourly kW demand of each case, by each sub-system. The following charts were generated from detailed, hourly energy use simulation algorithms developed by HP CFS engineers.


This is the base case. Notice that the blue line, which represents the chiller power, is active all year.


This is the economizer case. Here you can see that the chiller power fluctuates much more, reducing the overall power consumption significantly. Notice also that the power for humidification (represented by the purple line) now becomes a larger contributor to the overall energy use. This is where the computer-based simulations become absolutely necessary so we can understand the entire picture before recommending a particular solution.
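The hourly analysis described above can be sketched at its simplest level as a count of the hours when outside air is cool enough to carry the load without the chiller. This is only a minimal illustration, not the HP CFS simulation: the 18 °C limit is an assumed supply-air threshold, and the handful of temperatures stands in for the 8,760 hourly values a real weather file would provide.

```python
# Minimal sketch: estimating air-side economizer availability from hourly
# dry-bulb temperatures. A full simulation would also consider humidity,
# partial economization, and fan energy; this only counts qualifying hours.
ECONOMIZER_LIMIT_C = 18.0  # assumed supply-air threshold, for illustration

def economizer_hours(hourly_drybulb_c):
    """Count hours when outside air is cool enough for free cooling."""
    return sum(1 for t in hourly_drybulb_c if t <= ECONOMIZER_LIMIT_C)

def economizer_fraction(hourly_drybulb_c):
    """Fraction of the year the chiller could, in principle, stay off."""
    return economizer_hours(hourly_drybulb_c) / len(hourly_drybulb_c)

# Synthetic sample: a cold night, a mild day, a hot afternoon
sample = [4.0, 6.5, 12.0, 17.9, 18.1, 25.0, 31.0, 16.0]
print(economizer_hours(sample))     # 5 hours at or below the limit
print(economizer_fraction(sample))  # 0.625
```

Run against a full year of hourly data, this kind of count is what makes one climate a far better free-cooling candidate than another.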

Will the new presidential order speed up green IT?

The Federal Leadership in Environmental, Energy, and Economic Performance presidential order went into effect on October 5, 2009. The main goal of the order is "to establish an integrated strategy towards sustainability in the Federal Government and to make reduction of greenhouse gas emissions a priority for Federal agencies". It requires that "the agency head shall consider reductions associated with: (i) reducing energy intensity in agency buildings; (ii) increasing agency use of renewable energy and implementing renewable energy generation projects on agency property".

There are 3-month and 8-month deadlines for the agencies to put targets and plans in place for reducing their carbon footprint. Could this finally be the push we need?

EPA issues new greenhouse gas reporting requirements

On September 22, 2009, EPA Administrator Lisa Jackson signed the rule for a new greenhouse gas emissions reporting program, a precursor to potential carbon-taxing policy. The reporting covers the top carbon emitters in the U.S., primarily in the manufacturing sector, where carbon emissions come from the site itself (as opposed to indirect emissions from electricity generation). In this first round of reporting, automobile manufacturers and other large industrial facilities are not included.

If you read the documentation from the EPA, you will see that the threshold for reporting is emissions of 25,000 metric tons of CO2e (carbon dioxide equivalent) annually. Furthermore, the EPA documents indicate that facilities emitting at least 25,000 metric tons of CO2e annually account for 85% of total U.S. GHG emissions. Keep in mind that these are facilities that manufacture items involving CO2 in the process and oftentimes have on-site power generation using solid or gaseous fuel. These are considered direct emissions since they originate at the site itself.

Since most data centers and other large commercial buildings do not typically have on-site electric generation, the CO2e emissions from these facilities are considered indirect emissions, since they are attributable to electricity generated at an off-site facility owned by another entity. The EPA has issued a Technical Support Document on the Proposed Rule for Mandatory Greenhouse Gas Reporting, which discusses the benefit of reporting indirect emissions attributable to a facility as a means to encourage awareness of energy use and as a way to understand the impact that facilities have on the efficiency of electric power generators. While this is not part of the new reporting program, it gives insight into the longer-term plans of the U.S. EPA.

The indirect CO2e emissions from the generation of electricity for a data center are derived from the amount of electricity used (expressed in kWh), the fuel source, and the efficiency of generation. The eGRID documentation makes this calculation easy: if you know the facility's annual electricity usage in kWh, just multiply it by the factors in the eGRID tables. These are expressed in pounds of CO2 per megawatt-hour, so you will need to convert pounds to metric tons and kilowatt-hours (kWh) to megawatt-hours (MWh). The average for the United States is approximately 1,300 pounds of CO2 per megawatt-hour of electricity generated.
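The conversion just described is simple enough to sketch in a few lines. The 1,300 lb/MWh figure is the approximate U.S. average cited above; for a real facility you would look up the factor for your eGRID subregion.

```python
# eGRID-style conversion: annual kWh, times an emission factor in
# lb CO2 per MWh, converted to metric tons of CO2.
LB_PER_METRIC_TON = 2204.62
US_AVG_LB_PER_MWH = 1300.0  # approximate U.S. average from eGRID

def annual_co2_metric_tons(annual_kwh, lb_per_mwh=US_AVG_LB_PER_MWH):
    mwh = annual_kwh / 1000.0          # kWh -> MWh
    pounds = mwh * lb_per_mwh          # MWh -> lb CO2
    return pounds / LB_PER_METRIC_TON  # lb -> metric tons

# A facility drawing a constant 1 MW uses 8,760,000 kWh per year:
print(round(annual_co2_metric_tons(8_760_000)))  # about 5,166 metric tons
```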

Why is this relevant to data centers? How many metric tons of CO2e does a data center emit, anyway? I did some cursory energy analysis to answer this question. Using the range of emission rates for U.S. electric generators, I modeled a data center located in Chicago, IL, and considered facilities of different sizes: 2, 5, 10 and 20 MW. The data is visualized in this graph:



Read the chart as follows:

1. On the X-axis, locate the data center capacity.

2. Draw a line straight up to the curve that best represents the CO2 emission output (in pounds per kWh) of the electrical generator, state, or NERC region where the data center facility is located.

3. Draw a horizontal line to the Y-axis from where the vertical line intersects the appropriate curve.

4. Read the value on the Y-axis where the line intersects.



Table listing the different eGRID regions and the average GHG emissions from both base-loaded and non-base-loaded electrical power generation





Maps of the United States showing the different eGRID regions (left) and the average CO2 output per state in lb per megawatt-hour (right)


So what does it all mean? Based on the analysis, data centers over 10 MW in capacity will likely face scrutiny in the future based on their indirect CO2 emissions. And as the generation mix gets more CO2-intensive, it becomes very difficult for facilities to stay under a threshold of 25,000 metric tons annually. Remember, I am making assumptions that the threshold would apply and that corporations will need to report indirect emissions. But it is important to be prepared if this becomes the case.
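The chart-reading procedure above can also be done numerically. The sketch below estimates annual indirect emissions for the modeled capacities and compares them against the 25,000 metric-ton threshold; the constant full-load assumption is mine, made only to keep the example simple, and real facilities would apply their own load profile and regional eGRID factor.

```python
# Capacity (MW) -> annual indirect CO2 (metric tons), compared against
# the 25,000-ton reporting threshold discussed in the post.
LB_PER_METRIC_TON = 2204.62
THRESHOLD_TONS = 25_000

def annual_tons(capacity_mw, lb_per_mwh, hours=8760, load_factor=1.0):
    """Annual metric tons of CO2, assuming a flat load profile."""
    mwh = capacity_mw * hours * load_factor
    return mwh * lb_per_mwh / LB_PER_METRIC_TON

for mw in (2, 5, 10, 20):
    tons = annual_tons(mw, lb_per_mwh=1300.0)  # approximate U.S. average
    flag = "over" if tons > THRESHOLD_TONS else "under"
    print(f"{mw:>2} MW: {tons:>9,.0f} t CO2 ({flag} threshold)")
```

At the U.S.-average emission rate, a 2 MW facility stays well under the threshold while a 10 MW facility lands well over it, which is consistent with the conclusion above.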

CO2e is a metric that includes the other primary contributors to greenhouse gas emissions. The formula used to determine CO2e is:


CO2e = (EF1 × CO2) + (EF2 × CH4) + (EF3 × N2O)


where EFi is the emission factor attributable to each particular gas. The U.S. EPA publishes this information in its eGRID documents for all electrical power providers in the United States.

Visualization of Data - Comparing Hourly Climate Data with Psychrometrics

So here is another example of data visualization. Instead of a table showing hourly dry-bulb and dew-point temperatures for different locations, plot them on a psychrometric chart. Remember, there are 8,760 hours in a year, so analyzing even one dataset (let alone several) as a table is certainly a daunting task.

The first image shows four different worldwide climate locations plotted on a standard psychrometric chart. (See the end of this post for information from Wikipedia on psychrometrics.) You will notice the different groupings of data points for the different climates.

To make sense of these images, see the second graphic (accessible through hyperlink) which identifies which part of the psych chart corresponds to what type of climate. With these images alone you can draw the following conclusions about the predominant weather conditions for the locations:

  1. Chicago - cold and dry with large seasonal fluctuations in temperature and humidity

  2. Dublin, Ireland - mostly cool with moderate humidity levels

  3. Edmonton - cold and dry with seasonal fluctuations mostly in temperature

  4. Hong Kong - hot and humid with proportionate fluctuations in temperature and humidity

While these are certainly generalized depictions of the climates, they are accurate enough to give a good sense of each one. They lead to a logical next step: a more thorough analysis of a particular climate to see how data center energy use is affected by temperature and humidity.
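For readers who want to reproduce this kind of chart from raw weather data, the key step is converting each hour's dew point into a humidity ratio, the psychrometric chart's vertical axis. The sketch below uses the common Magnus approximation for saturation vapor pressure; the coefficients are standard textbook values, not anything from the post.

```python
import math

# Humidity ratio from dew-point temperature, via the Magnus approximation
# for saturation vapor pressure over water. This is the calculation behind
# plotting hourly climate data on a psychrometric chart.
P_ATM_KPA = 101.325  # standard sea-level atmospheric pressure

def sat_vapor_pressure_kpa(t_c):
    """Saturation vapor pressure over water (Magnus approximation)."""
    return 0.61094 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(dew_point_c, p_kpa=P_ATM_KPA):
    """kg of water vapor per kg of dry air at the given dew point."""
    pw = sat_vapor_pressure_kpa(dew_point_c)
    return 0.622 * pw / (p_kpa - pw)

# A Hong Kong-style humid hour vs. an Edmonton-style cold, dry hour:
print(round(humidity_ratio(25.0), 4))   # roughly 0.02 kg/kg
print(round(humidity_ratio(-10.0), 4))  # roughly 0.0018 kg/kg
```

Plotting 8,760 (dry-bulb, humidity ratio) pairs per location produces exactly the kind of point clouds described above, with the hot-humid climates clustered high and right and the cold-dry climates low and left.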




About the Author(s)
  • Bill Kosik, PE, CEM, BEMP, LEED AP BD+C, is a Distinguished Technologist at HP Data Center Facilities Consulting (DCFC). He is responsible for execution of project work, training and mentoring of internal engineering and consulting teams, research and analysis of topics related to data center energy use, and industry presentations and writing assignments. He is one of the main technical contributors who have shaped the DCFC technical expertise in energy use optimization and sustainable design practices in data centers. He is a member of the Consulting-Specifying Engineer Editorial Advisory Board.
The opinions expressed above are the personal opinions of the authors, not of HP. By using this site, you accept the Terms of Use and Rules of Participation.