Keeping your cool

The Middle East is one of the more difficult regions in which to site a data centre, with issues such as extreme heat making building and maintaining a facility particularly challenging.

[Pictured: Sundeep Raina, regional sales manager, Chatsworth Products International.]
By Georgina Enzer. Published February 4, 2013.

Other areas of innovation include developments based on traditional air-based cooling and chiller technology, as well as evaporative systems.

“In the most basic of terms, air cooling works by pumping air through the facility and over the IT equipment. Liquid cooling instead uses water or a similar liquid material at the server level. Liquid cooled servers are either suspended into a pool, or placed inside a sealed container which ensures the computer is in constant direct contact with the non-conductive liquid,” explains Hopton.

These distinct methods can be narrowed down even further. Air cooling can be separated into direct and indirect systems, which make use of either a heat-exchange system or a naturally cool climate.

Liquid cooling can be split between cold and warm liquid cooling options, with reference to the input/output temperatures of the liquid, which are around 13/18 degrees centigrade for cold liquid cooling and 45/50 degrees centigrade for warm.
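The cold/warm distinction above could be sketched in code. This is an illustrative sketch only, not anything from the article: the function name and the 30 °C midpoint threshold are assumptions chosen to separate the article's two example temperature bands (13/18 °C and 45/50 °C).

```python
# Illustrative sketch: classify a liquid-cooling loop as "cold" or "warm"
# based on the approximate input/output temperatures quoted above
# (13/18 C for cold liquid cooling, 45/50 C for warm).
# The 30 C midpoint threshold is an assumption for illustration.

def classify_loop(inlet_c: float, outlet_c: float) -> str:
    """Return a crude classification from the loop's mean temperature."""
    midpoint = (inlet_c + outlet_c) / 2
    return "cold liquid cooling" if midpoint < 30 else "warm liquid cooling"

print(classify_loop(13, 18))  # cold liquid cooling
print(classify_loop(45, 50))  # warm liquid cooling
```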

“With cooling becoming such a massive issue within the IT industry, all these avenues are in constant development and are seeing radical improvements made every day,” Hopton adds.

Regional innovations
James Coughlan, director of Cannon Technologies Middle East, says that the region is not seeing any new innovations around cooling, but is seeing a significant take-up of existing, already proven cooling technologies.

“The four main technologies we see in the region are aisle containment, in-row cooling, chilled water and, for the smaller data centres with a limited number of racks, Direct eXpansion cooling,” he states.

Aisle containment is focused on the cold rather than the hot aisle, the main reason being that a power failure would quickly lead to a significant heat surge in the hot aisle, bringing the risk of thermal runaway. By containing the cold aisle and working to the ASHRAE guideline of 24 degrees centigrade, there is room to deal with an emergency with minimal risk to equipment.
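As a small illustration of working to the guideline above, a sketch of a cold-aisle temperature check, assuming hypothetical sensor names and readings (nothing here is from the article beyond the 24 °C figure):

```python
# Illustrative sketch: flag cold-aisle sensors reading above the ASHRAE
# guideline supply temperature of 24 C cited above.
# Sensor names and readings are hypothetical.

ASHRAE_GUIDELINE_C = 24.0

def overheating_sensors(readings: dict, limit: float = ASHRAE_GUIDELINE_C) -> dict:
    """Return only the sensors whose cold-aisle temperature exceeds the limit."""
    return {name: t for name, t in readings.items() if t > limit}

readings = {"rack-01": 22.5, "rack-02": 24.8, "rack-03": 23.9}
print(overheating_sensors(readings))  # {'rack-02': 24.8}
```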

In-row cooling allows for very targeted cooling and is especially effective at dealing with power-intensive workloads such as blade servers, which create large amounts of heat.

Chilled water cooling is increasingly being seen in high-power environments such as blade servers. In the Middle East, this technology has limited use because of the problems of cooling the water. Where it is used, it can require significant capital investment to ensure that the water can be effectively cooled, which often means using coolants such as glycol rather than a cooling tower.

Direct expansion is often favoured by data centres with a limited amount of equipment. It is relatively easy to install but has two significant drawbacks.

The first is that the equipment is complex and this means that there is a high overhead in maintenance. The second is that it uses coolants which need to be very carefully handled.

According to Raina, the Middle East does not have the luxury of free air cooling that Europe, America and other regions enjoy, so data centre managers and CIOs/CFOs should be looking at ways to do more with less energy. "Some places in the Middle East still do not include any form of air containment system, from medium 'greenfield' designs to high-density data centres," he explains.

Operational expenditure is often overlooked in favour of capital expenditure. Ultimately, these CIOs and CFOs end up spending significantly more on cooling-related OPEX that could easily have been avoided.

There is also a need to deploy the right physical infrastructure from day one for devices that will come to market at a later stage and are likely to push higher temperatures into the data centre.

“The cabinets in a data centre can no longer be looked upon as boxes that are used to store the IT equipment. They form an integral part of the data centre architecture as an extension of the cooling system,” says Raina.
