The data center market has expanded dramatically in the past few years, and it doesn’t show signs of slowing down. Many clients and building owners are requesting modular data centers, which can be placed anywhere data capacity is needed. Modular data centers can help cash-strapped building owners add a new data center (or more capacity) to their site, and can assist facilities with unplanned outages, such as disruptions due to storms. Owners look to modular data centers to accelerate the “floor ready” date compared with a traditional brick-and-mortar build.
3. Learning objectives
• Learn the differences between the various types of modular data centers.
• Know about the benefits and negative aspects of specifying modular data centers.
• Understand the cooling requirements and issues associated with modular data centers, including compliance with ASHRAE Standard 90.1 and the International Energy Conservation Code (IECC).
• Understand the unique power/electrical requirements and issues associated with modular data centers.
4. Definitions
MDF: main distribution frame
IDF: intermediate distribution frame
ISO: International Organization for
Standardization
PUE: power usage effectiveness
UPS: uninterruptible power supply
5. Presenters:
Bill Kosik, PE, CEM, BEMP, LEED AP BD+C
HP Critical Facilities
Data Center Energy Technologist
Brian Rener, PE, LEED AP
M+W U.S. Inc. - A Company of the M+W Group
Electrical Engineering Discipline Platform Leader
Quality Assurance Manager
Amara Rozgus
CFE Media
Consulting-Specifying Engineer and Pure Power
Editor in Chief/Content Manager
7. Aiming for the data center of the future
•Efficient and effective
•Self-regulating
•Standardized processes
•Fully available and resilient
•Monitoring & control
•Fully service-oriented
•Green
•Business-centric
•Integrated
•Modular and elastic
•Shared resource pools
•Policy-based
•Fully automated
•Right sourcing
8. The growing Internet of Things (IoT)
[Infographic: pervasive connectivity, explosion of information, and smart device expansion. In 60 seconds today (2013): 400,710 ad requests; 2,000 lyrics played on TuneWiki; 1,500 pings sent on PingMe; 208,333 minutes of Angry Birds played; 23,148 apps downloaded; 416,340 tweets. By 2020: 30 billion devices (1); 40 trillion GB of data (2) … for 8 billion (3); 10 million mobile apps (4).]
Sources: (1) IDC Directions 2013: Why the Datacenter of the Future Will Leverage a Converged Infrastructure, Matt Eastwood, March 2013; (2) & (3) IDC Predictions 2012: Competing for 2020, Document 231720, Frank Gens, December 2011; (4) http://en.wikipedia.org
The IoT is a world where nearly everything is connected to a data center: cars, home appliances, glasses, watches, jewelry, clothes… even packaged goods. Every one of these devices will, one way or another, be connected to a data center for control, management, and analysis. The required data center capacity cannot be served effectively with current data center and server architectures.
9. Software Defined Server
• 45 hot-plug cartridges per chassis: compute, storage, or a combination
– Single-server cartridges = 45 servers per chassis
– Quad-server cartridges = 180 servers per chassis
• Approximate average of 55 W per cartridge (20 W min, 90 W max)
– 45 servers per chassis = 450 servers per rack = 24 kW per rack
– 180 servers per chassis = 1,800 servers per rack = 97 kW per rack
Data centers can put up to 1,800 servers in a single 47U rack, which could take 10 times as many racks using a standard architecture. This extreme density reduces, for a given unit of work, the data center's size, energy consumption, complexity, and cost.
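As a sanity check on those density figures, here is a minimal Python sketch of the arithmetic. It assumes 10 chassis per 47U rack (inferred from the jump from 45 servers per chassis to 450 per rack) and applies the slide's 55 W average per server; it lands near, not exactly on, the slide's 24 kW and 97 kW figures.

    # Back-of-envelope rack density and power.
    # Assumptions: 10 chassis per 47U rack (inferred from
    # 45 servers/chassis -> 450 servers/rack) and ~55 W average
    # per server (slide average; 20 W min, 90 W max).
    CHASSIS_PER_RACK = 10
    W_PER_SERVER = 55.0

    def rack_stats(servers_per_chassis):
        servers_per_rack = servers_per_chassis * CHASSIS_PER_RACK
        kw_per_rack = servers_per_rack * W_PER_SERVER / 1000.0
        return servers_per_rack, kw_per_rack

    print(rack_stats(45))   # single-server cartridges: (450, 24.75) ~ 24 kW
    print(rack_stats(180))  # quad-server cartridges: (1800, 99.0) ~ 97 kW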
10. Modular Data Center Basics
• Two basic types of pre-manufactured spaces
– Containers (ISO and non-ISO)
– Modular rooms
11. Modular Data Center Basics
Containerized
– Lower capex; scalable and relocatable
– ISO
• Conform to ISO size standards: 10-, 20-, 40-, and 53-ft lengths are standard, with a typical width of 9.5 ft
• Usually built to UL standards and not occupied
• Up to 19 conventional IT racks
• 3 kW to 40 kW per rack, and higher
– Non-ISO
• Can be any size
• May be built to IBC/IFC codes
• Can be occupied
12. Modular Data Center Basics
Modular rooms
– Prefabricated rooms, assembled on-site
– Expandable construction
– Rapid deployment compared with stick-built construction
– Generally the same features as conventional data centers
13. Modular Data Center Basics
Containerized
– Many types and configurations
• All-in-one units
• IT/data only
• MDF/IDF modules
• Power gear and UPSs
• Cooling modules
– Can be located outside or inside a structure
– Code officials are often unfamiliar with them
– Rapid and scalable deployment
14. Server Power Use Efficiency
Based on testing data, average server power held steady, with an increase in 2011. Idle power as a percentage of full power trended downward over the 2007-2013 testing period.
15. Server Power and Inlet Temperature
In general, server power demand increases with inlet temperature. This graph shows server airflow and power requirements based on inlet temperature and workload percentage. Note that both power and airflow increase as the inlet temperature rises above 28°C, even at an idle workload.
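A toy model of that behavior, in Python: the 28°C knee is from the graph, but the idle and fan wattages and the 5%-per-degree fan ramp are invented for illustration, not taken from the test data.

    # Toy model: above a knee temperature, server fans ramp up, and fan
    # power follows roughly the cube of fan speed (fan affinity laws).
    # Only the 28 C knee is from the slide; all other numbers are assumed.
    def server_power_w(inlet_c, idle_w=150.0, fan_w=30.0, knee_c=28.0):
        if inlet_c <= knee_c:
            return idle_w + fan_w
        fan_speed = 1.0 + 0.05 * (inlet_c - knee_c)  # assumed 5%/degC ramp
        return idle_w + fan_w * fan_speed ** 3

    for t in (20, 26, 28, 32, 36, 40):
        print(f"{t} C inlet -> {server_power_w(t):.0f} W")

The cubic term is why power climbs so quickly past the knee: a modest fan speed increase costs disproportionate fan energy, even at idle workload.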
16. Increasing Temperature to Reduce Energy Use
Conclusion: Using hotter inlet temperatures works well in hot climates and when using economizers. In cold climates, there is relatively little difference because economization using colder temperatures is available for most of the year.
17. PUE Varies Based on Climate
There is a 0.35 difference in PUE based on climate and cooling system type.
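Since PUE is total facility power divided by IT equipment power, a 0.35 spread is substantial. A minimal sketch; the two overhead breakdowns are illustrative numbers, not measured data from the study:

    # PUE = total facility power / IT equipment power.
    # The overhead values below are illustrative, not measured.
    def pue(it_kw, cooling_kw, electrical_losses_kw, other_kw=0.0):
        total_kw = it_kw + cooling_kw + electrical_losses_kw + other_kw
        return total_kw / it_kw

    # Same 1,000 kW IT load in two hypothetical climates:
    print(f"{pue(1000, 250, 80):.2f}")   # economizer-friendly climate: 1.33
    print(f"{pue(1000, 580, 100):.2f}")  # hot/humid, mechanical cooling: 1.68
    # Difference = 0.35, matching the spread cited above.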
18. Modular Data Center Cooling Technologies
– Direct OA/evaporative: Consists of a supply fan, filters, direct evaporative media, and a direct expansion cooling assembly. Most efficient in cold- to moderate-temperature environments with low to moderate humidity levels.
– Indirect evaporative: Consists of a supply fan, filters, indirect evaporative media, and a direct expansion cooling assembly. Provides separation from environments with high levels of air pollution; 100% recirculation allows the unit to run a closed air circuit.
– Heat wheel: Consists of multiple supply and exhaust fans, filters, a heat transfer wheel, and a direct expansion cooling assembly. Isolates the outdoor air stream where direct use in the data center is not possible.
– Air-to-air HX/heat pipe: Consists of multiple supply and exhaust fans, filters, an air-to-air heat exchanger or heat pipe, and a direct expansion cooling assembly. Isolates the outdoor air stream where direct use in the data center is not possible.
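Distilling those four descriptions into a first-pass screen, as a hypothetical sketch (the decision criteria are simplifications of the text above, not engineering selection criteria):

    # Hypothetical first-pass screen for the four module types above.
    # Simplified from the slide text; real selection needs full analysis
    # of the climate, water availability, and air quality at the site.
    def screen_cooling(cold_to_moderate, dry_to_moderate, polluted_air):
        # Direct use of outdoor air needs a clean, cool, reasonably dry site.
        if not polluted_air and cold_to_moderate and dry_to_moderate:
            return ["direct OA/evaporative"]
        # Otherwise isolate the outdoor air stream from the IT air stream.
        return ["indirect evaporative", "heat wheel",
                "air-to-air HX/heat pipe"]

    print(screen_cooling(True, True, False))   # -> ['direct OA/evaporative']
    print(screen_cooling(True, True, True))    # isolated options only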
19. Cooling Modularity and Energy Efficiency
• Multiple cooling methods
– Adaptable to different climate zones
– Tuned to the local environment
– Provide the highest efficiency for a particular location
– Use external cooling in most climates, reducing power and water consumption
• External containers
– Easier installation, maintenance, and upgrades
– Protection for critical IT equipment
– Add containers to accommodate increased IT loads
27. Summary of Optimization Levels
• Extreme regional variations in CO2 from electricity generation
• Determine appropriate balance of water and electricity usage
• Climate WILL impact HVAC energy use – select sites carefully
• Use evaporative cooling where appropriate
• Economizer strategy will be driven by climate characteristics
• Design power and cooling modularity to match IT growth
• Plan for power-aware computing equipment
• Use aisle containment or direct-cooled cabinets
• Design in ability to monitor and optimize PUE in real time
• Push for highest supply temperatures and lowest moisture levels
• Identify tipping point of server fan energy/inlet temperature
• Minimize data center footprint by using high-density architecture
28. Modular Data Centers - Electrical
Containerized power criteria
– Typical maximum IT load is around 1 MW
– Voltage levels: 120/208 V, 400 V, 480 V, or 600 V
– Multiple power sources
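For feeder sizing at those voltages, the standard three-phase relationship I = P / (√3 × V × PF) shows why higher distribution voltages matter in a space-constrained container. A quick sketch; the 1 MW load is from the slide, while the 0.95 power factor is an assumption:

    import math

    # Three-phase feeder current: I = P / (sqrt(3) * V_line-line * PF).
    # 1 MW IT load from the slide; 0.95 power factor is assumed.
    def feeder_amps(watts, volts_ll, power_factor=0.95):
        return watts / (math.sqrt(3) * volts_ll * power_factor)

    for v in (208, 400, 480, 600):
        print(f"{v} V: {feeder_amps(1_000_000, v):,.0f} A")
    # Higher voltage -> lower current -> smaller conductors and gear,
    # which matters inside a container with limited raceway space.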
29. Modular Data Centers - Electrical
Tier ratings
– Available from Tier I to Tier IV
30. Modular Data Centers - Electrical
Container examples
– Combined DC module
– Power modules
37. Bill Kosik, PE, CEM, BEMP, LEED AP BD+C
HP Critical Facilities
Data Center Energy Technologist
wjk@hp.com
Brian Rener, PE, LEED AP
M+W U.S. Inc. - A Company of the M+W Group
Electrical Engineering Discipline Platform Leader
Quality Assurance Manager
Brian.Rener@mwgroup.net
Thank You!