Inside the building, existing services, cabling or pipework can make airflow less than optimal, reducing the capacity and efficiency of the cooling system.
The pitfalls of current cooling strategies
The increase in power costs and the recent Climate Change Agreement for data centres are leading operators to reduce their overall site power usage effectiveness (PUE), and this is pushing them towards certain types of solutions – usually adiabatic systems.
While PUE is the current measure of choice, it is not perfect and may well be replaced in the future by a different metric – one that takes more account of water usage or the effects of higher supply temperatures.
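As a rough illustration of how the metric works – a minimal sketch with hypothetical load figures, not data from any real facility:

    # PUE = total facility power / IT equipment power; 1.0 would be perfect.
    def pue(total_facility_kw, it_load_kw):
        return total_facility_kw / it_load_kw

    it_load_kw = 1000.0   # power drawn by the IT equipment itself (assumed)
    overhead_kw = 300.0   # cooling, UPS losses, lighting and so on (assumed)

    print(pue(it_load_kw + overhead_kw, it_load_kw))   # prints 1.3

A water-focused successor would need an input that PUE does not capture at all, such as the litres of water consumed per kWh of IT energy.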
Some companies have also rushed into solutions based purely on electrical running costs, only to discover that they are lacking in other technical areas, such as environmental control.
Some data centres are now using pure fresh air systems to cool the data centre space. While this may be very simple and cheap to run, it brings many complications when it comes to air contamination from outside, fire protection and humidity control. Data centres have been taken offline because external fires forced the cooling systems to shut down, for example.
Of course, having a back-up direct expansion (DX) or chilled water system mitigates the risk, but it does complicate the solution.
The current trend is towards indirect adiabatic coolers, which seem to provide a good compromise between running costs and environmental control, yet the sheer physical size of these units means they are not the solution for every problem. There may always be a use for more traditional chilled water systems – or DX systems – even on new builds.
What’s the right approach?
It’s clear that when it comes to choosing and deploying data centre cooling solutions, there are a number of issues at play. Operators need to decide what is most important and where they are comfortable making a compromise, if necessary.
One vital element to take into account is current rack density, together with a best-guess projection of what will be required in the future.
A big challenge for colocation providers in particular is the unknown load distribution from customers, who are deploying higher and higher rack densities. It is important to know what a solution can and can’t cope with; having plenty of spare capacity in the system will help handle such scenarios. The quality of the data centre environment and its maintainability should also be acknowledged as more significant than low running costs.
Once a cooling system is
deployed, the environment needs
to be managed to ensure the most
effective performance. That means an
appropriate amount of airflow around
the racks, good aisle containment,
tidy cabling, blanking panels and
a clear floor void, if used for air
delivery. Running at less than 50 per cent of capacity in normal operation puts less strain on components and reduces faults, so the data centre environment is more stable during maintenance or other events.
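To see why that headroom matters, here is a minimal sketch of a capacity check for a redundant (N+N) plant; the unit sizes and design load below are assumptions for illustration only:

    # With N+N redundancy, either half of the plant must be able to carry the
    # full load on its own, so in normal operation every unit runs well below
    # 50 per cent of the installed capacity.
    installed_units = 4        # two pairs of cooling units (N = 2), assumed
    unit_capacity_kw = 300.0   # nameplate capacity per unit, assumed
    design_load_kw = 500.0     # worst-case IT heat load, assumed

    normal = design_load_kw / (installed_units * unit_capacity_kw)
    one_side_lost = design_load_kw / ((installed_units // 2) * unit_capacity_kw)

    print(f"Normal operation: {normal:.0%} of installed capacity")        # 42%
    print(f"One side offline: {one_side_lost:.0%} of remaining capacity") # 83%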
Data centres are technologically complex and it is now more critical than ever to ensure the right operating environment.
Cooling at the Node4 Northampton facility
Earlier this year, Node4 invested £2m to expand its Northampton data centre, and the upgrade saw the facility built around an innovative cooling system. When choosing its cooling solution, Node4 started by selecting the type of cooling technology to deploy. Picking the right solution meant evaluating several critical factors: the quality of components and controls, the support and maintenance provided, and the upfront and running costs.
Node4 installs a lot of cooling capacity in its data centres (N+N on its main sites) to cope with maintenance events or faults, but also to ensure that equipment runs efficiently. At the Northampton data centre, Node4 has a number of systems in place, each reflecting the best solution available at the time of installation.
The facility has a free cooling chilled water system, which uses traditional CRAC units and an even more efficient ‘cool wall’ system by Rittal/Weiss, consisting of giant cooling coils and decoupled fans. Node4 also uses cold aisle containment in its data halls; however, in its new 400-rack hall it uses hot aisle containment and an indirect adiabatic cooling system based on Emerson EFCs.
The solutions are as efficient as they can be without compromising on the quality of the environment. As a result, the cool wall system gives typical PUEs of 1.3, while the EFC system will be as low as 1.1. Even then, it will still deliver supply air in the low 20s Celsius, using a DX boost on the hottest days.
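To put those figures in context, here is a quick back-of-the-envelope comparison; the 1 MW IT load is purely illustrative, not a Node4 number:

    # Facility overhead implied by a given PUE: overhead = IT load x (PUE - 1).
    it_load_kw = 1000.0   # assumed IT load for illustration only
    for pue in (1.3, 1.1):
        overhead_kw = it_load_kw * (pue - 1.0)
        print(f"PUE {pue}: {overhead_kw:.0f} kW of cooling and other overhead")
    # PUE 1.3 -> 300 kW of overhead; PUE 1.1 -> 100 kW, a two-thirds reduction.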
If the temperature inside the data centre rises to excessive levels, considerable damage could be caused. By understanding customer
requirements, future demands
and the physical data centre itself,
operators can create the optimal
temperature environment that
balances cooling with costs, efficiency
and control.