COOLING
Most data centre operators and server manufacturers are
set up for this and know how to deal with this much heat.
But CPUs are about to get hotter – a lot hotter.
Heat loads on the next generation of CPUs and GPUs are
heading beyond 200W per chip towards 400W and even
500W – the natural corollary of their increased processing
power. Sticking with today’s air cooling would force density
down, wiping out performance and power utilisation
efficiencies – it just doesn’t work.
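To make the density arithmetic concrete, here is a minimal sketch; the ~20 kW air-cooled rack ceiling, the per-server overhead and the server configurations are illustrative assumptions, not figures from the article:

```python
# Illustrative arithmetic: how rising per-chip heat loads squeeze
# air-cooled rack density. All figures below are assumptions.

AIR_COOLED_RACK_LIMIT_W = 20_000  # assumed practical ceiling for air cooling

def servers_per_rack(chips_per_server: int, watts_per_chip: int,
                     overhead_w: int = 300) -> int:
    """Servers that fit under the rack's thermal budget.

    overhead_w covers fans, drives, memory etc. per server (assumed).
    """
    server_w = chips_per_server * watts_per_chip + overhead_w
    return AIR_COOLED_RACK_LIMIT_W // server_w

# Dual-socket servers at 200 W per chip vs 500 W per chip:
today = servers_per_rack(chips_per_server=2, watts_per_chip=200)   # 28 servers
future = servers_per_rack(chips_per_server=2, watts_per_chip=500)  # 15 servers
print(today, future)
```

Under these assumed numbers, the same rack holds roughly half as many servers once chips reach 500W – which is the density collapse the text describes.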
Regulation is in the works that will make it increasingly
difficult to run an inefficient or wasteful data centre – this
will make refreshes more frequent and will, despite the hotter
CPUs, demand greater density. This change will be imposed
on the industry. More positively though, to take advantage
of the opportunities presented by these changes, speed and
proximity will become the key to winning – we’ll have to
change, but there are great opportunities for competitive
advantage in embracing that change.
High quality and readily available bandwidth and cheap
sensors have combined to create the IoT – billions of
connected devices processing billions of pieces of information
every second. Although the phrase is somewhat hackneyed,
edge computing and its impact continue to grow. Although
you should do all you can in the centralised cloud, not
everything can be done there, for various reasons: criticality,
latency and data sovereignty/security being chief among them.
It’s clear that we’re going to require ever bigger
and more numerous data centres for some time to come.
From an environmental standpoint, data centres consume
a staggering percentage of the world’s electricity, with
experts predicting that by 2025, ICT will account for 20%
of the world’s electricity usage and contribute more than
3.5% of global carbon emissions – more than aviation and
shipping. When you consider that cooling currently accounts
for around 40% of data centre energy usage, it becomes
apparent that for technology to continue to keep pace with
our demands without damaging the environment, we’re
going to have to seek alternative solutions.
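The cooling share translates directly into PUE (power usage effectiveness: total facility energy divided by IT equipment energy). A minimal sketch – the 40% cooling share comes from the article, while the example kilowatt figures are illustrative assumptions:

```python
# Power Usage Effectiveness = total facility power / IT equipment power.
# A perfectly efficient facility would score 1.0.

def pue(it_kw: float, cooling_kw: float, other_kw: float = 0.0) -> float:
    """PUE for a facility, given IT load, cooling load and other overheads."""
    total = it_kw + cooling_kw + other_kw
    return total / it_kw

# Assumed facility where cooling is 40% of the total draw:
# 1,000 kW IT + 700 kW cooling + 50 kW other -> cooling = 700/1750 = 40%.
print(round(pue(1000, 700, 50), 2))  # 1.75
```

At these assumed numbers, cooling alone pushes PUE towards 1.75 – every reduction in cooling load feeds straight back into useful compute per watt.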
Electricity use and carbon emissions aren’t the only
resource issues facing the data centre: it is also
consuming massive amounts of water, with a typical data
centre getting through the equivalent of an Olympic-sized
pool every day. And water usage isn’t simply an economic
or environmental issue; it’s also a political one, especially
in places with water supply challenges such as California
– home to the world’s biggest tech companies and their
hyperscale data centres.
Power and location
Over the next 20 years or so, it’s predicted much of the
world’s economic growth will come from Africa, with less
developed parts of Asia and South America also likely
to go from strength to strength. For these places to truly
participate in the global economy and take advantage of the
opportunities presented by technological progress, they’re
going to need data centres, and these consume lots of power.
Generally speaking, these locations are hot and experience
power and water poverty, which makes building data centres
with traditional cooling techniques challenging. Every kilowatt
of power used on cooling is not being used to provide more
compute power.
Time for liquid
So, what’s the solution? Whether apocryphal or not,
industrialist Henry Ford is credited with having once said, “If
I had asked people what they wanted, they would have said
faster horses.” Today, those with a vested interest in the status
quo would have you believe that what you need is colder air.
What’s really needed is new cooling methodologies and
formats. So, how about liquid cooling? After all, liquids have
thousands of times the heat transfer capacity of air, and some
industries have been taking advantage of this for years.
Before the data centre industry takes the plunge and
embraces liquid cooling, it needs to achieve certain things:
it needs to be able to reduce capital expenditure through
reduced complexity, be available in familiar and data centre
ready form factors, be safe and free from risks of leaks or
fires, have a fast ROI and low TCO, be simple to integrate
with existing infrastructure, be easy to deploy, and use as
little water as possible.
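The “thousands of times” heat-capacity claim can be sanity-checked from volumetric heat capacities; the constants below are standard room-temperature textbook values, not figures from the article:

```python
# Volumetric heat capacity = density * specific heat capacity.
# Approximate textbook values at room temperature.

AIR_DENSITY = 1.2        # kg/m^3
AIR_CP = 1005.0          # J/(kg*K)
WATER_DENSITY = 1000.0   # kg/m^3
WATER_CP = 4186.0        # J/(kg*K)

air_vol_heat = AIR_DENSITY * AIR_CP        # ~1,206 J/(m^3*K)
water_vol_heat = WATER_DENSITY * WATER_CP  # ~4.19e6 J/(m^3*K)

ratio = water_vol_heat / air_vol_heat
print(round(ratio))  # 3471 – water moves thousands of times more heat per volume
```

A cubic metre of water absorbs roughly 3,500 times the heat of a cubic metre of air for the same temperature rise, which is why liquid cooling loops can be so much more compact than airflow paths.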