Keeping your cool

The Middle East is one of the more difficult regions in which to site a data centre, with issues such as extreme heat making building and maintaining a data centre particularly challenging.

Sundeep Raina, regional sales manager, Chatsworth Products International.
By Georgina Enzer. Published February 4, 2013

The Middle East is one of the more difficult regions in which to site a data centre, with issues such as extreme heat making building and maintaining a data centre particularly challenging. To understand what data centre owners are doing to mitigate these problems, regional cooling experts share their insights.

Temperature is the biggest issue for data centre owners in the Middle East. Trying to keep a data centre environment cool when outside temperatures push over 50 degrees Celsius can be a major headache for regional enterprises.

Another problem, often overlooked when talking about cooling in the Middle East, is the cost of energy. Energy is plentiful and subsidised because oil is a state resource.

“Costs per kWh across the region average out at around 9 US cents. As a headline figure, that might not seem that much less than a very large UK, European or US carrier might pay on a long term contract. However, once you strip out sales taxes and, in places such as the UK, carbon emissions tax, it works out at less than 50% of the cost in other areas.

“This low cost of energy means that the main impetus for energy reduction - cost - has little to no impact here. As a result, solutions that require a high CAPEX in order to deploy are not favoured because the benefits and ROI are seen as being extremely marginal,” says Sid Deshpande, senior research analyst at IT market research firm, Gartner.

However, the cost of data centre cooling is still high: it is estimated that within 18 months, the cost to power and cool a server equals the initial cost of purchasing it. Green cooling methods are proving popular globally as a way to reduce these costs, according to Carrie Higbie, Siemon’s global director of data centre solutions and services. Direct sunlight is another serious problem for data centres in the region.
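The "18 months" figure above is a rule of thumb that depends on the server's price, its power draw, the energy tariff and the cooling overhead (expressed as PUE, power usage effectiveness). A minimal sketch of that break-even arithmetic, with purely illustrative numbers not taken from the article:

```python
# Illustrative sketch only: the server price, power draw, tariff and PUE
# below are assumptions chosen to show the calculation, not article data.

HOURS_PER_MONTH = 730  # average hours in a calendar month

def monthly_energy_cost(server_kw, price_per_kwh, pue):
    """Monthly cost to power AND cool one server.

    PUE scales the IT load to include cooling and other facility overhead.
    """
    return server_kw * HOURS_PER_MONTH * price_per_kwh * pue

def months_to_match_purchase(server_price, server_kw, price_per_kwh, pue):
    """Months until cumulative power + cooling spend equals the server price."""
    return server_price / monthly_energy_cost(server_kw, price_per_kwh, pue)

# Hypothetical example: a $2,000 server drawing 0.5 kW at the region's
# ~9 US cents/kWh with a PUE of 1.8.
print(round(monthly_energy_cost(0.5, 0.09, 1.8), 2))          # USD per month
print(round(months_to_match_purchase(2000, 0.5, 0.09, 1.8), 1))  # months
```

Note that at the region's subsidised ~9 cents/kWh the break-even stretches well past 18 months, which is exactly Deshpande's point: cheap energy weakens the cost impetus for efficiency investment.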

“Direct sunlight can also be a data centre killer. External mechanical infrastructure will be directly under this intense force for prolonged periods of time, putting these systems under great stress and strain,” explains Peter Hopton, CTO and founder of international data centre cooling experts Iceotope. One solution is to move the data centre to a cooler climate, or invest in co-location with a provider in a cooler country, but for many companies, this is simply not practical or desirable. The second option is to look for alternative new technologies that can maintain the optimum temperature for your data centre and reduce costs.

Cooling innovations
The use of outside air cooling, also known as free air cooling, is gaining acceptance in many areas. Rear door heat exchangers, which passively place cooling capacity closer to equipment where the most heat is generated, are also becoming more popular. While opinions surrounding data centre containment continue to vary among hot aisle and cold aisle proponents, innovations in data centre containment solutions continue to mature.

“I believe the real innovation in data centre cooling is the ability to measure and monitor efficiency and make adjustments based on automation, load and demand,” states Higbie. “Indirect evaporative cooling shows some promise in hotter climates like the Middle East, and I foresee the next few years yielding several new technologies across all regions.”

New data centre cooling solutions are also seeing increased interest from large data centre owners. The primary drivers of this interest are high-density, high-performance server and storage infrastructure, rising energy and real estate costs, and an increased focus on long-term green and sustainability initiatives.

One of the emerging areas in this space is liquid cooling: most organisations are aware of it, but actual implementations lag well behind the level of interest.

“Liquid cooling solutions become more cost effective than traditional air cooling solutions as power per rack density increases. The different types of liquids used for cooling are water, water plus additives, glycol, refrigerants and high dielectric fluids. While liquid cooling is not new, the rise in density per rack and evolution of coolant technologies are some of the drivers for increased interest,” says Deshpande.
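Deshpande's point about cost effectiveness rising with rack density can be sketched with a simple linear cost model. The coefficients below are invented to show the shape of the comparison, not vendor figures: liquid cooling is assumed to carry a higher fixed cost (plumbing, manifolds) but a lower marginal cost per kW removed, so the two cost lines cross at some rack density.

```python
# Illustrative only: all cost coefficients are hypothetical assumptions.
# Annual cooling cost per rack modelled as fixed cost + cost per kW of load.

def annual_cost(kw_per_rack, fixed, per_kw):
    """Annual cooling cost (USD) for one rack at a given IT load."""
    return fixed + per_kw * kw_per_rack

def crossover_density(fixed_air, per_kw_air, fixed_liquid, per_kw_liquid):
    """Rack density (kW) above which liquid cooling becomes cheaper per year.

    Assumes liquid has the higher fixed cost but the lower marginal cost
    per kW, so the two lines cross exactly once.
    """
    return (fixed_liquid - fixed_air) / (per_kw_air - per_kw_liquid)

# Hypothetical coefficients (USD per rack per year):
AIR = dict(fixed=500, per_kw=400)      # cheap to install, costly per kW
LIQUID = dict(fixed=3000, per_kw=150)  # costly to install, cheap per kW

print(crossover_density(AIR["fixed"], AIR["per_kw"],
                        LIQUID["fixed"], LIQUID["per_kw"]))  # kW per rack
```

Under these made-up numbers the crossover lands at 10 kW per rack; the real figure depends entirely on the facility, but the qualitative shape is why density growth drives liquid cooling interest.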

According to Sundeep Raina, regional sales manager at global IT hardware manufacturer and IT service provider Chatsworth Products International, globally, there are various approaches to building data centres without chiller plants.

“This has been accomplished by very different strategies. For example, using ocean, river or lake water to remove heat from a plate and frame heat exchanger connected to the data centre cooling loop has helped some data centres to eliminate chiller plants. Others have done it with free air cooling and then sometimes operating inside one of the ASHRAE allowable envelopes rather than strictly living inside the recommended environmental envelope. When we see server OEMs delivering IT equipment according to the A1, A2, A3 and A4 ASHRAE TC9.9 categories, then data centre operators will be in a much better position for making total cost of ownership [TCO] decisions on equipment purchases and operating parameters,” he said.
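For context on the envelopes Raina mentions, a sketch of the commonly published ASHRAE TC9.9 (2011) allowable dry-bulb ranges per equipment class follows. These values are quoted from memory of the public guidelines, not from the article; verify against the current TC9.9 thermal guidelines before relying on them.

```python
# Commonly published ASHRAE TC9.9 (2011) allowable dry-bulb envelopes (degC)
# per equipment class; treat as indicative, and check the current guidelines.

ALLOWABLE_C = {
    "A1": (15, 32),
    "A2": (10, 35),
    "A3": (5, 40),
    "A4": (5, 45),
}
RECOMMENDED_C = (18, 27)  # the narrower recommended envelope

def within_allowable(equipment_class, dry_bulb_c):
    """True if a dry-bulb temperature sits inside the class's allowable range."""
    low, high = ALLOWABLE_C[equipment_class]
    return low <= dry_bulb_c <= high

# A3/A4-rated gear tolerates hotter intake air, widening the window for
# free cooling without a chiller plant -- Raina's point about TCO decisions.
print(within_allowable("A2", 35))  # True: at the upper edge of A2
print(within_allowable("A1", 40))  # False: outside A1's envelope
```

The wider the allowable envelope an OEM certifies, the more hours per year a free-cooling design can run without mechanical refrigeration, which is what makes the class labels a TCO input.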
