Unit 2
Chapter 4: Cooling
Index
• Cooling Costs
– Power Cost
– Causes of Cost
– Calculating Cooling Needs
• Reducing Cooling Costs
– Economizers
– On-Demand Cooling
– HP’s Solution
• Optimizing Airflow
– Hot Aisle/Cold Aisle
– Raised Floors
– Cable Management
– Vapor Seal
– Prevent Recirculation of Equipment Exhaust
– Supply Air Directly to Heat Sources
– Fans
– Humidity
• Adding Cooling
– Fluid Considerations
– System Design
• Datacentre Design
– Centralized Control
– Design for Your Needs
– Put Everything Together
Cooling
• With any amount of power comes heat, and if
there’s too much heat in the datacenter, you
can expect trouble.
• Cooling Costs:
– Some estimates state that cooling can account for
upward of 63% of your IT department’s power
usage.
– Figure out how much you are actually spending and how much you actually need to spend.
– Understand how much power costs and how those costs are computed.
– Electricity is paid for per kilowatt-hour (kWh).
– A kilowatt-hour is the amount of energy a 1 kW load consumes in one hour (a short cost sketch follows this list).
– A 100-watt (W) bulb uses 100 watt-hours of electricity
in 60 minutes.
– As such, ten 100 W light bulbs will use a total of 1
kWh of electricity per hour.
– But electricity prices vary from region to region.
– International Data Corp. estimated that companies
worldwide spent about $29 billion to cool datacenters
in 2007, up 400% from 2000—IDC, 2006.
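• To see how the kilowatt-hour arithmetic above turns into money, here is a minimal cost sketch in Python; the load, tariff, and hours are illustrative values only, not figures from the text.

```python
# Rough monthly electricity cost for a continuously running load.
# The wattage and tariff below are illustrative example values.

load_watts = 10 * 100          # e.g., ten 100 W bulbs (about 1 kW of load)
price_per_kwh = 0.12           # example tariff in dollars; rates vary by region
hours_per_month = 24 * 30

energy_kwh = (load_watts / 1000) * hours_per_month   # kWh consumed in a month
monthly_cost = energy_kwh * price_per_kwh

print(f"Energy: {energy_kwh:.0f} kWh, cost: ${monthly_cost:.2f} per month")
# -> Energy: 720 kWh, cost: $86.40 per month
```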
• Causes of Cost:
– Cooling is a major component of power consumption
and IT budget.
– The heat load to be removed is typically expressed in British Thermal Units (BTUs) per hour or in kilowatts (kW).
– One kilowatt is equivalent to roughly 3,412 BTUs per hour (a small conversion helper follows this list).
– Issues driving up power consumption and cooling costs include the following:
• Increased power consumption as more servers and storage
devices are deployed.
• Increased heat density in the racks because of increased
computing power in a confined space.
• Irregular heat load in the datacenter. This is
exacerbated by poor planning for heat management as
the topology of the datacenter changes.
• Increasing power costs.
• A tendency to overcool datacenters. The “flood-cooling
impulse” leads datacenter managers to overcool their
datacenters by more than two and a half times what is
needed.
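• Because cooling loads are quoted in both kW and BTUs, a small conversion helper keeps the numbers straight; the rack and unit figures below are hypothetical examples.

```python
# Convert between kW of heat load and BTU/hr (1 kW ≈ 3,412 BTU/hr).
KW_TO_BTU_PER_HR = 3412

def kw_to_btu(kw):
    """Heat load in kW -> BTU per hour."""
    return kw * KW_TO_BTU_PER_HR

def btu_to_kw(btu_per_hr):
    """Cooling capacity in BTU per hour -> kW."""
    return btu_per_hr / KW_TO_BTU_PER_HR

print(kw_to_btu(5))      # a hypothetical 5 kW rack -> 17,060 BTU/hr
print(btu_to_kw(24000))  # a 24,000 BTU/hr unit -> about 7 kW of cooling
```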
• Calculating Your Cooling Needs:
– All the equipment in your server room generates
heat.
– So do the lighting and the people working there.
– All these sources of heat contribute to the heat
load of the server room.
– In order for your air conditioner to cool a room, its
output must be greater than the heat load.
– To determine the heat load, consider the following:
1. Room Size:
• To calculate the cooling needs of the room, use this
formula:
• Room Area BTU = Length (m) × Width (m) × 337
2. Windows:
• Most often, server rooms have no windows.
• If you have windows, look at these formulas to
determine which is most applicable to your datacenter:
• South Window BTU = South Facing Window Length (m) × Width (m)
× 870
• North Window BTU = North Facing Window Length (m) × Width (m)
× 165
• If there are no blinds on the windows,
multiply the results by 1.5.
• Windows BTU = South Window(s) BTU + North Window(s) BTU
3. People in the Room:
• Total Occupant BTU = Number of occupants × 400
4. Equipment:
• Find the equipment’s power consumption in its
documentation or on the vendor websites.
• Equipment BTU = Total wattage for all equipment × 3.5
5. Lighting:
• Lighting BTU = Total wattage for all lighting × 4.25
6. Total Cooling Requirement:
• Total Heat Load = Room Area BTU + Windows
BTU + Total Occupant BTU + Equipment BTU +
Lighting BTU
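• The formulas above can be folded into one routine. This is a minimal sketch; the room dimensions, occupancy, and wattages are hypothetical example values, and window area is taken as length × width.

```python
# Total heat load of a server room, following the formulas above.
# All inputs are illustrative values for a hypothetical room.

def total_heat_load_btu(length_m, width_m,
                        south_window_area_m2=0.0, north_window_area_m2=0.0,
                        windows_have_blinds=True,
                        occupants=0, equipment_watts=0, lighting_watts=0):
    room_btu = length_m * width_m * 337
    windows_btu = south_window_area_m2 * 870 + north_window_area_m2 * 165
    if not windows_have_blinds:
        windows_btu *= 1.5
    occupant_btu = occupants * 400
    equipment_btu = equipment_watts * 3.5
    lighting_btu = lighting_watts * 4.25
    return room_btu + windows_btu + occupant_btu + equipment_btu + lighting_btu

# Example: an 8 m x 6 m windowless room, 2 people, 12 kW of IT gear, 400 W of lighting
load = total_heat_load_btu(8, 6, occupants=2, equipment_watts=12000, lighting_watts=400)
print(f"Total heat load: {load:,.0f} BTU (~{load / 3412:.1f} kW of cooling)")
```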
Reducing Cooling Costs
• It is wiser to deploy equipment that won't consume a lot of power in the first place.
• The following types of equipment can save money and help supplement your cooling environment.
• Economizers:
– Winter provides an opportunity to enhance your cooling system by using the cold outside air to cool things down.
– Employ an economizer for this.
– There are two types:
1. air-side economizers, and
2. water-side economizers.
• Air:
– An air-side economizer regulates the use of outside air
for cooling a room or a building.
– It employs sensors, ducts, and dampers to regulate
the amount of cool air brought in.
– The sensors measure air temperature both inside and
outside the building.
– If the sensors detect that the outside air is cold enough to cool the datacenter, the system adjusts its dampers to draw in the outside air, making it the main source of cooling (a sketch of this decision logic appears below).
[Figure: air-side economizer, showing the damper and duct]
– This cuts or eliminates the need for the air
conditioning system’s compressors, which
provides a big cost savings.
– Because the economizers are drawing air in from
outside, pollution can potentially enter the
datacenter.
– A larger concern is the change of humidity in the
datacenter.
– What you spend on filtration and humidification
might be more than if you just used your regular
air-conditioning system.
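• As a rough illustration of the decision logic just described, the sketch below opens the dampers and idles the compressors only when the outside air is cold and dry enough; the thresholds are illustrative assumptions, not values from any particular product.

```python
# Simplified air-side economizer decision logic (illustrative thresholds only).

def economizer_mode(outside_temp_c, inside_setpoint_c, outside_humidity_pct,
                    max_humidity_pct=60.0, min_delta_c=3.0):
    """Return (damper opening in %, whether compressors can be idled)."""
    cold_enough = outside_temp_c <= inside_setpoint_c - min_delta_c
    dry_enough = outside_humidity_pct <= max_humidity_pct
    if cold_enough and dry_enough:
        return 100, True     # free cooling: full outside air, compressors off
    return 10, False         # minimum ventilation only, mechanical cooling on

damper_pct, compressors_off = economizer_mode(
    outside_temp_c=8.0, inside_setpoint_c=22.0, outside_humidity_pct=45.0)
print(damper_pct, compressors_off)   # -> 100 True
```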
• Fluid:
– A water-side economizer utilizes evaporative
cooling (usually provided by cooling towers) to
indirectly produce chilled water to cool a
datacenter when outdoor conditions are cool
(often at night).
– This is best for environments with temperatures below 55 degrees Fahrenheit (about 13°C) for 3,000 or more hours a year.
[Figure: water-side economizer]
– Using economizers, chilled-water-plant energy
consumption can be cut by up to 75%.
– You will also see reductions in maintenance costs,
because the fluid-chilled cooling system allows
you to drastically reduce—maybe even completely
eliminate—the need for chiller operation.
– They not only save costs, but, unlike air-side economizers, they don't let contaminants or altered humidity levels into the datacenter.
– Water-side economizers work with a cooling tower,
evaporative cooler, or dry cooler to cool down the
datacenter.
– This type of economizer is normally incorporated into
a chilled water or glycol-based cooling system.
– Fluid in the cooling system passes through a coil to
cool the room, thus eliminating the need for the
compressor to operate.
On-Demand Cooling
• These units are brought in to provide temporary
cooling when central air is down.
• There are two types of on-demand cooling systems,
very similar in function to economizers:
• Air to air :
– Smaller air-to-air coolers can be wheeled into the room
needing cooling.
– They use flexible ductwork to connect to a window, and
then the generated heat is transferred out of the building.
– They can be plugged into a standard 110-volt wall outlet.
– Larger units can be mounted on the outside of the
building, with cool air being ducted through a
window.
– These units operate on temporary 208-to-230-volt
circuits.
• Water based :
– These are much larger units, where a standard
garden hose is connected to the device so that
water flows in, cools down the equipment, and
then is sent through a second hose to run down a
drain.
HP’s Solution
• Hewlett-Packard offers a cooling technology
that it says can cut an IT department’s power
costs by up to 40%.
• The system, called Dynamic Smart Cooling,
uses sensors to control the temperature in
specific areas of the datacenter.
• In its labs, HP was able to reduce the power needed to cool a datacenter from 45.8 kW, using a standard industry setup, to 13.5 kW.
• Dynamic Smart Cooling is an intelligent
solution, and rather than turning your
datacenter into a meat locker, the system
allows air conditioners—managed by specially
designed software—to regulate the cold air
delivered to a room based on the needs of
specific computers.
• Dynamic Smart Cooling uses the datacenter’s
air conditioning system to adapt to changing
workloads with sensors attached to the
computers.
• If the system senses that a computer is
warming up too much, air conditioners will
send more cool air.
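• Dynamic Smart Cooling itself is proprietary, but the general idea of sensor-driven cooling can be sketched as a simple feedback loop in which only the zones running hot get extra cold air. Everything below (zone names, setpoints, step size) is hypothetical and is not HP's implementation.

```python
# Toy sketch of sensor-driven, per-zone cooling (not HP's actual system).
# Each zone's cooling output is nudged up or down based on its own sensor.

SETPOINT_C = 24.0
STEP_PCT = 5          # how much to adjust airflow per control cycle

def adjust_cooling(zone_temps_c, current_output_pct):
    """Return a new cooling output (%) for each zone based on its reading."""
    new_output = {}
    for zone, temp in zone_temps_c.items():
        output = current_output_pct.get(zone, 50)
        if temp > SETPOINT_C + 1.0:
            output = min(100, output + STEP_PCT)   # zone running hot: more cold air
        elif temp < SETPOINT_C - 1.0:
            output = max(20, output - STEP_PCT)    # zone overcooled: back off
        new_output[zone] = output
    return new_output

readings = {"row-A": 27.2, "row-B": 23.8, "row-C": 21.5}   # hypothetical sensors
print(adjust_cooling(readings, {"row-A": 60, "row-B": 60, "row-C": 60}))
# -> {'row-A': 65, 'row-B': 60, 'row-C': 55}
```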
Optimizing Airflow
• To deliver a precise cooling environment, air must be exchanged at a sufficient rate.
• Normal office environments change their air over about twice an hour.
• In high-density datacenters, air may have to be exchanged as often as 50 times an hour.
• If enough air is not exchanged, cooling air will
heat up before it reaches the equipment, and
disaster could occur.
• Good practices can help minimize your costs
without you having to buy the newest
product.
• Some best practices that can help optimize the airflow around your servers and other networking equipment include the following:
• Hot Aisle/Cold Aisle:
– Equipment is typically designed to draw in air from
the front and then blow the exhaust out the rear.
– The cool sides of equipment are arranged
together, whereas the hot sides of equipment face
each other.
– This allows the equipment to draw in cool air,
rather than air that has already been preheated by
the rack of equipment in front of it.
– The cold aisles have perforated floor tiles to draw
cooler air from the raised floor.
– Floor-mounted cooling units are placed at the ends of the hot aisles, not parallel to the rows of racks.
– This is because parallel placement can cause the
hot exhaust to be drawn across the top of the
racks and mixed with the cool air.
– It also decreases overall energy efficiency.
• Raised Floors:
– Datacenters are conventionally built on a floor
that is raised 18 to 36 inches.
– The higher the floor level, the more air that can be
distributed under the floor and the more air that
can be used by the cooling system.
– But higher isn’t always practical.
– There can be major disruptions to day-to-day
operations.
– Plus, the higher up you build the floor, obviously,
the closer you’ll be getting to the ceiling.
– This can be a hindrance not only for rack sizes, but
also for the flow of air over the top of equipment.
• Cable Management:
– Developing a good cable management system in
conjunction with the hot-aisle/cold-aisle design
can equate to more energy efficiency.
– Whenever possible, it’s best to route your cables
under the hot aisle, as shown in Figure 4-5.
– This keeps the cool air's path to the equipment unobstructed as it is drawn in through the perforated tiles and into the equipment's cooling systems.
– Some racks now provide expansion channels that
help with cable management and ease heat
removal for high-density racks.
– Some organizations are also running cabling above
or through racks, rather than under the floors, to
reduce the interference with the flow of air from
below.
– Some are deploying advanced power strips to
bring the power load closer to the rack rather
than running so many cables through the
datacenter.
[Figure: power strips]
• Vapor Seal:
– It’s also important to ensure you have a good
vapor barrier in your datacenter, cutting it off
from the rest of the building.
– If you have a poor vapor barrier, humidity will
move into the datacenter during hot months and
escape during the winter months.
– A good vapor seal reduces the costs to humidify or
dehumidify.
[Figure: 6-mil (0.15 mm) polyethylene plastic sheet used as a vapor barrier between insulation and gypsum board]
• https://youtu.be/fSdD9r5K4RU
Prevent Recirculation of Equipment
Exhaust
• Your networking gear can get hot enough on its own and doesn’t
need help from its neighbors—nor does it need to heat up its
neighbors.
• The following are some simple steps you can employ in your
datacenter to prevent exhaust from being reabsorbed by other
devices.
1. Hot aisle/cold aisle: Employ the hot-aisle/cold-aisle design mentioned earlier.
2. Rigid enclosures: Build rigid enclosures to keep exhaust heat
from being sucked back into the device’s cool air intakes.
3. Flexible strip curtains: Use flexible strip curtains to block the open air above racks that have been configured into a hot-aisle/cold-aisle layout.
4. Block unused rack locations with blanks: Equipment typically draws in cool air from the front and exhausts it out the back. Blanking unused openings in the rack prevents the exhaust from being drawn back into the device's intake.
5. Design with cooling in mind: Although most equipment draws air in from the front and exhausts it out the back, some has top-discharge or side-to-side designs. Configure your racks so that no piece of equipment blows its exhaust into the intake of another.
6. Select racks with good airflow: Buy racks that
don’t have an internal structure that would
block the smooth flow of air to your
equipment.
Supply Air Directly to Heat Sources
• Rather than shelling out the money to cool
the entire datacenter, save some money and
just cool down the devices generating heat.
• These tips can help:
1. Use the correct diffusers: The type of
diffuser you would use in an office is not
appropriate for a datacenter. Select diffusers
that deliver air directly to the equipment that
needs cooling.
[Figure: diffusers]
2. Correctly place supply and returns: Diffusers
should be placed right by the equipment to be
cooled. They should not be placed so they direct
cooling air at heat exhausts, but rather into the
air intakes. Supplies and slotted floor tiles
should not be placed near returns to prevent a
cool air “short circuit.”
3. Minimize air leaks: Systems that use a raised
floor can lose cool air through cable accesses in
hot aisles.
4. Optimize air conditioner placement: In large
datacenters, a computational fluid dynamics (CFD)
model would be useful. This helps locate the best
placement for cooling units. It also helps minimize the
distance between air conditioner units and large
loads.
5. Use properly sized plenums: Return plenums need to
be the right size to allow a lot of air to flow through.
Obstructions such as piping, cabling trays, and
electrical conduits need to be taken into
consideration when plenum space is calculated.
6. Provide enough supply: Under-floor supply
plenums must be big enough to allow
enough air to service your equipment. Again,
take into consideration obstacles such as
piping, cabling trays, and electrical conduits.
Fans
• Fans also suck up a lot of power, especially
when a lot of them are spinning at the same
time.
• Take these tips into consideration to improve
fan efficiency:
1. Use a low-pressure drop system : Use low-
pressure drop air handlers and ductwork.
Make sure there is enough capacity in your
under-floor plenums to allow air to flow.
2. Use redundant air handlers during normal operations: It is more efficient to run multiple fans at lower speed than a single fan at high speed. Per the fan affinity laws, fan power drops roughly with the cube of fan speed, so operating two fans at 50% speed uses considerably less power than one fan at full speed.
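• A quick check of that claim, assuming the common cube-law approximation from the fan affinity laws; the fan rating is an example value.

```python
# Fan affinity law: power scales roughly with the cube of fan speed.
# Compare one fan at 100% speed with two identical fans at 50% speed,
# which together move roughly the same amount of air.

rated_power_kw = 5.0                                 # example fan rating at full speed

one_fan_full = rated_power_kw * 1.0 ** 3             # 5.00 kW
two_fans_half = 2 * rated_power_kw * 0.5 ** 3        # 2 x 0.625 = 1.25 kW

print(f"One fan at 100%: {one_fan_full:.2f} kW")
print(f"Two fans at 50%: {two_fans_half:.2f} kW")    # roughly a quarter of the power
```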
Humidity
• Datacenter cooling systems must also be able to
adapt to exterior temperature and humidity.
• Because these factors will change depending on
where on the globe the datacenter is located—
along with the time of year—datacenter air-
conditioning systems must be able to adapt to
these sorts of changes.
• Too much humidity or too little humidity can
wreck your datacenter equipment.
• Use these tips to help keep your datacenter at
the right level:
1. Establish a humidity sensor calibration schedule:
Humidity sensors drift and require frequent
calibration—more so than temperature sensors. Also,
incorrect humidity sensors are less likely to be noticed
than incorrect temperature sensors. As such, establish
a frequent test and calibration schedule for your
humidity sensors.
2. Allow for sensor redundancy: Make sure you have enough sensors to keep an eye on your datacenter's humidity level. To ensure tight control, multiple sensors should be used; use at least two, but more are better (a simple cross-check sketch follows this section).
[Figure: Wi-Fi humidity and temperature sensor; humidity sensors for HVAC in room, wall, or duct-mount enclosures]
3. Manage humidity with a dedicated unit: If ventilated
air is used (maybe from an air-side economizer),
control humidity with a single ventilation air handler.
4. Lock out economizers when necessary: When using
an air-side economizer, minimize the amount of air
that’s brought in when the dew point is low. This
saves money on having to humidify the dry air.
5. Centralize humidity control : Each datacenter should
have its own centralized humidity control system.
Multiple systems wind up fighting each other, and the
system becomes less efficient.
[Figure: gray air-handling unit for a central rooftop ventilation system]
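• To illustrate tip 2 (sensor redundancy), here is a minimal cross-check sketch that compares each humidity sensor against the group consensus and flags likely drift; the sensor names, readings, and threshold are hypothetical.

```python
# Cross-check redundant humidity sensors; flag likely drift for recalibration.
# Sensor names, readings, and the drift threshold are hypothetical.

from statistics import median

def check_humidity_sensors(readings_pct, drift_threshold_pct=5.0):
    """Return (consensus reading, sensors that look out of calibration)."""
    consensus = median(readings_pct.values())
    suspect = [name for name, value in readings_pct.items()
               if abs(value - consensus) > drift_threshold_pct]
    return consensus, suspect

readings = {"sensor-1": 46.0, "sensor-2": 47.5, "sensor-3": 58.9}
consensus, suspect = check_humidity_sensors(readings)
print(consensus, suspect)   # -> 47.5 ['sensor-3']  (schedule a calibration check)
```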
Adding Cooling
• If your datacenter is especially “equipment
dense,” you’ll need to add some extra cooling
capacity.
• The best way to cool your equipment is to make
sure the cooling gear is as close as possible to the
heat sources.
• When you decide how to supplement your
cooling systems, you should consider what type
of system to use (air or fluid based) and what
type of design the system will use.
Fluid Considerations
• As anyone with a car knows, fluid is a great way to move
heat from equipment (in this case, the engine) to keep it
cool.
• Fluid-based cooling systems have to be used with care.
• Water isn’t the only fluid used for cooling.
• Though water is normally used in floor-mounted cooling, R-134a refrigerant is typically used when cooling is brought closer to the equipment, because of safety concerns.
• Refrigerant turns into a gas when it reaches the open air, so leakage doesn't pose a threat to your equipment.
• Table 4-3 lists the advantages and disadvantages of both
solutions.
• However, it isn’t just safety and effectiveness
that makes refrigerant a good match for
cooling needs.
• Refrigerant-based solutions employ microchannel coils for better efficiency, and a low-pressure system results in lower operating costs.
• They can also provide energy-efficiency savings of between 25 and 35%, measured in kilowatts of cooling capacity per kW of heat load.
System Design
• Because getting close to the heat source is so
important, the cooling system’s design is important to
consider.
• There are two common designs in datacenters—open
and closed.
• In a closed design, the electronics and cooling
equipment are situated together in a sealed
environment.
• The benefit of this is that it is a high-capacity cooling
solution.
• The downside is that the design isn't as flexible or as fault-tolerant.
• In a datacenter environment, however, an open design is
preferred, because a closed solution offers little flexibility.
• For example, if a cooling system fails, the racks are isolated
from the room’s own cooling opportunities.
• Inside the enclosure, the server can reach its over-
temperature limit in 15 seconds.
• With an open architecture, modules can be positioned
close to the racks, but are not enclosed, so room air can be
a sort of backup if the cooling equipment fails.
• This makes it much safer for both your organization’s data
reliability as well as the hardware’s physical health.
• Not least of all, you have much greater flexibility to
configure and reconfigure your datacenter as the system
evolves.
Datacenter Design
• You can optimize your cooling needs by how
you design your datacenter.
• A number of issues can help you reduce the
amount of cooling you need, simply by how
you design your datacenter and how cooling is
deployed.
Centralized Control
• When designing your cooling plan, it’s best to employ a
custom centralized air-handling system.
• This sort of system offers several benefits over the
prevalent multiple-distributed unit system, including
the following:
– Better efficiency.
– Can use surplus and redundant capacity.
– Units can work in conjunction with each other, rather than
fighting against one another.
– Uses fluid-cooled chiller plants, which are much more efficient than air-cooled alternatives.
– Less maintenance is required.
Design for Your Needs
• Unfortunately, datacenter power and cooling systems rarely get sized to exactly fit their loads.
• They are usually loaded too lightly.
• Although a certain amount of dark art is involved in getting the size right, it is important to get as close as you can, so that the electrical and mechanical systems still operate properly when underloaded but remain scalable for larger loads.
• You can come close to this Zen-like balance if you
consider a few issues:
1. Upsize the duct, plenum, and piping
infrastructure. This reduces operating costs
and allows a measure of future-proofing.
2. Use variable-speed motor drives on chillers, chilled-water and condenser-water pumps, and cooling-tower fans to help with part-load performance. This can be especially helpful when they are controlled as part of a coordinated cooling system.
3. Examine efficient design techniques, such as
medium-temperature cooling loops and fluid-
side economizers.
4. Cooling-tower energy use is typically a small portion of overall energy consumption. If you upsize cooling towers, you can improve the performance of both the chillers and the fluid-side economizers.
Although this involves a larger cost up front and
a larger physical footprint, you’ll find savings in
operational costs.
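• The tradeoff in item 4 (a larger up-front cost for lower operating costs) can be checked with a simple life-cycle cost comparison; all of the figures below are illustrative assumptions.

```python
# Simple life-cycle cost comparison: standard vs. upsized cooling tower.
# Capital costs, energy figures, tariff, and time horizon are illustrative only.

def life_cycle_cost(capital, annual_energy_kwh, price_per_kwh=0.12, years=10):
    """Up-front cost plus energy cost over the analysis period."""
    return capital + annual_energy_kwh * price_per_kwh * years

standard = life_cycle_cost(capital=80_000, annual_energy_kwh=300_000)
upsized = life_cycle_cost(capital=110_000, annual_energy_kwh=240_000)

print(f"Standard tower: ${standard:,.0f} over 10 years")   # $440,000
print(f"Upsized tower:  ${upsized:,.0f} over 10 years")    # $398,000
```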
Put Everything Together
• Efficient cooling isn’t just a matter of installing
intelligent equipment.
• Organization-wide considerations must be
implemented, including design and decision-
making issues.
• Such issues include:
– Use life cycle cost analysis as part of your decision-
making process.
– Involve all key stakeholders to keep the team together
on the project. Document and clarify the reasons for
key design decisions.
– Set quantifiable goals based on best
practices.
– Introduce energy optimization as early as
possible in the design phase to keep the
project focused and to keep costs
minimized.
– Include integrated monitoring, measuring,
and controls in facility design.
– Examine and benchmark existing facilities and
then track your performance. Look back over
the data and look for any opportunities to
improve performance.
– Evaluate the potential for onsite power
generation.
– Make sure all members of the facility-
operations staff get site-specific training,
including the identification and proper
operation of energy-efficiency features.
• As anyone who has tried to string network cabling through an old
building knows, planning for the future can save a lot of time,
energy, and money.
• The same philosophy is true of cooling systems—it’s a good idea to
plan for the future.
• Selecting a technology that can scale to future needs is a critical
part of your considerations.
• Because if you do add more power to your system, you’ll need to
add more cooling.
• Most server manufacturers are working on solutions that bring refrigerant-based cooling modules into the rack to manage heat densities of 30 kW and higher.
• This will make refrigerant-based systems compatible with the next
generation of cooling strategies.
• Your cooling system is such a huge portion of
your datacenter that it really merits a lot of your
attention—not just to reduce your electricity bill,
but also to mitigate your carbon emissions.
• Spending time and effort to ensure you have a
well-running cooling system will help not only
your organization but also the environment.
• It isn’t just your machinery that can help reduce
your impact on the environment.
End