
Anatomy of a Data Centre - When Hotter is Greener

Analysis: How to build a modern data centre. Rackspace and Digital Realty keep it simple.


There are many ways one can think of a purpose-built commercial data centre.

If you are a cloud customer, they are the physical buildings in which the data on which your business depends is securely located.

If you are a national politician (and you happen to have one in your constituency), they are welcome investments and vital hubs of the digital economy.

If you are an IT professional, they are warehouses for IT equipment and a place where you never go.

If you are a design, mechanical or electrical engineer, they are a huge set of problems to be solved: getting power into hungry servers, storage and network equipment with as little loss of electricity as possible, and removing the heat from that equipment using as little power as possible. To a mechanical engineer it's a giant fridge. Recent years have seen incredible strides in the engineering expertise and innovation that go into data centre design and build. Few industry sectors can be said to have made such advances in adopting new technologies and methods to reduce power consumption without sacrificing resilience; resilience and efficiency are of almost equal importance.

(I was thinking a lot about this as I toured the newly opened Rackspace facility in Crawley, Sussex - there was an actual ribbon cutting. I had been on the same site less than 15 months ago when it was just a large mud patch with some heavy construction plant dotted about.) The facility was supplied by Digital Realty, the global data centre finance and engineering firm, which designed and built it for a single client, the cloud services player Rackspace. It could eventually hold 50,000 physical servers, and it was designed for simplicity.

One of the interesting aspects of the project was the level of collaboration between the supplier, Digital Realty, and the client, Rackspace. Rackspace knows what it wants, and in any 15-month building project there were always going to be strains. Rackspace has been operating data centres for a long time, so it knows what works. Digital Realty has been building data centres for a long time, and it too knows what works. In the end, after more partnership than hostility (of which there was some), the first phase has been completed.

Mark Roenigk, COO of Rackspace, said: "This data centre is the epitome of intelligent 21st century infrastructure engineering. We partnered with industry leaders to design and deliver one of the most environmentally friendly and reliable data centres in Europe. Our customers depend on us for their mission critical managed IT services, and this new data centre furthers our commitment to delivering world class services. We are proud of the energy efficiency achieved with the innovative design that will become the starting point for boosting the adoption of more efficient technologies in the UK and Europe."

The initial build was constructed in 15 months and took approximately 500,000 man hours. It provides 6MW of capacity across two data suites, and will eventually comprise four data suites with a total of 12MW. The site allows for further expansion up to 30MW, and a staggered build approach maximises year-on-year advances in technology and efficiency gains.

William Stein, Digital Realty's Chief Executive Officer, commented: "With the addition of the Crawley site, the Digital Realty - Rackspace collaboration, which began in 2011, has been extended to a third continent. We are delighted to see Rackspace establish its new managed cloud data centre with such outstanding eco-credentials. With competition growing for facility services across the UK and Europe, we are pleased Rackspace chose Digital Realty as a provider to collaborate on this bespoke facility."


The build-out

The first 3MW suite is a single floor of four data halls. All of the supporting equipment - electrical switchgear, uninterruptible power supplies, cooling and back-up generators - is on the roof. When operating a data centre the first concern is getting the power to the IT equipment. This is spoken about in terms of power density. In this case the average power density is around 3.5kW per cabinet - the racks are 58U high, so they can take a lot of servers. Other areas may house converged systems such as VMware VBlocks or high-density storage equipment; here the power density may need to reach 7kW or 8kW. So the trick is to have this power delivered and converted from 11,000 volts to 130 volts while losing as little as possible along the way.
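
To get a feel for what those density figures imply, a rough back-of-the-envelope calculation helps. The sketch below uses the 3MW suite capacity and the 3.5kW and 8kW cabinet figures quoted above; treating every cabinet as drawing its full allocation is a simplifying assumption for illustration only.

```python
# Back-of-the-envelope cabinet arithmetic for a single 3MW data suite.
# Capacity and density figures are taken from the article; assuming every
# cabinet draws its full allocation is an illustrative simplification.

SUITE_CAPACITY_KW = 3_000        # one 3MW data suite
AVG_CABINET_KW = 3.5             # average power density per cabinet
HIGH_DENSITY_CABINET_KW = 8.0    # converged systems / dense storage areas

avg_cabinets = SUITE_CAPACITY_KW / AVG_CABINET_KW
dense_cabinets = SUITE_CAPACITY_KW / HIGH_DENSITY_CABINET_KW

print(f"~{avg_cabinets:.0f} cabinets at {AVG_CABINET_KW}kW each")
print(f"~{dense_cabinets:.0f} cabinets at {HIGH_DENSITY_CABINET_KW}kW each")
```

At the average density that works out to somewhere around 850 cabinets per suite before any high-density areas are carved out, which gives a sense of how many 58U racks have to be fed and cooled.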

The power to the site comes from separate substations and is fed to the switchgear on the roof, then through the transformerless UPS and down an overhead busbar in the data hall, and ultimately to the equipment. The data hall is where the rows of racks and cabinets house the IT equipment. Picture 50,000 industry standard servers - it won't all be servers, of course; there will be top-of-rack switches and storage in the form of hard disk drive and flash arrays. So picture a fully loaded data hall of servers, storage and networking equipment running at 75 per cent of capacity on a hot summer's day, and try to imagine the heat generated.

There are many ways to cool a data centre. The old method was to run air conditioning units blowing cold air vaguely in the direction of the IT equipment and having fans extract that air to the outside. This, of course, is not efficient. The method chosen by Digital Realty was indirect adiabatic cooling, using a system from UK manufacturer Excool. Factory-built cooling units are deployed on the roof above the data hall. The data hall is laid out so that the IT equipment itself sucks in cooled air using the fans that are already built into the hardware.

This air goes through the equipment into a sealed hot aisle - picture back-to-back rows of cabinets with space in between for engineers to walk, each row 36 cabinets long and 58U high. These two rows are entirely sealed, with only the outside face of the IT equipment visible. Access to the hot aisle and the back of the equipment is through self-closing doors, and the enclosure reaches the ceiling of the hall.

The air in this hot aisle is pulled up through the ceiling to the roof-mounted cooling unit, where it is cooled and sent back to the data hall. This air is never mixed with outside air; it remains sealed. For most of the year it is cooled by non-evaporative means, passing through a heat exchanger against outside air that is cooler than the heated return air. On the hottest of days the outside cooling air is additionally cooled using fine water sprays.
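
In control terms the cooling unit has only a small decision to make. The sketch below is not Excool's control system; it is a minimal illustration of the logic an indirect adiabatic cooler follows, and the temperature threshold in it is an assumption chosen purely for the example.

```python
# Minimal sketch of the decision an indirect adiabatic cooling unit makes.
# This is an illustration of the general technique, not Excool's actual
# controls; DRY_COOLING_LIMIT_C is an assumed threshold for the example.

DRY_COOLING_LIMIT_C = 21.0  # assumed outside temperature up to which dry
                            # heat exchange alone can hold the hall setpoint

def cooling_mode(outside_air_c: float) -> str:
    """Choose how to cool the sealed return air from the hot aisle."""
    if outside_air_c <= DRY_COOLING_LIMIT_C:
        # Most of the year in the UK: the heat exchanger alone is enough.
        return "dry heat exchange only (free cooling)"
    # Hottest days: spray fine water over the outside-air side of the
    # heat exchanger so evaporation lowers its effective temperature.
    return "dry heat exchange plus adiabatic water sprays"

for outside_temp in (5, 18, 29):
    print(f"{outside_temp}C outside -> {cooling_mode(outside_temp)}")
```

At no point does the sealed internal air mix with outside air; all that changes from season to season is how hard the unit has to work to move heat across the exchanger.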

Why all the bother? This is known as a 'free cooling' system, and the effort that goes into it is because air conditioning - so-called mechanical cooling and chillers - is incredibly expensive to run. What you are left with is a data hall with a temperature of 23 to 24 degrees C. Again this is different, and it adds to the green credentials of the facility, because hotter is greener and Rackspace is confident it can safely run its equipment at these temperatures. The data halls can also spike to temperatures as high as 27 degrees C for short periods.

Rackspace expects the cooling design to deliver opex savings of up to £1.2m. Power Usage Effectiveness (PUE), a measure of data centre efficiency calculated as the total electricity delivered to the building divided by the electricity used by the IT equipment, comes in at a design value of 1.15.
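
To put that design figure in concrete terms, the sketch below works the PUE arithmetic through for an assumed IT load; the 1.15 value is the design target quoted above, while the 6MW load is simply a round number borrowed from the phase capacity for illustration.

```python
# What a design PUE of 1.15 means in practice.
# PUE = total facility power / IT equipment power.
# The 6MW IT load is an assumed round number for illustration only.

DESIGN_PUE = 1.15
it_load_mw = 6.0

total_facility_mw = it_load_mw * DESIGN_PUE
overhead_mw = total_facility_mw - it_load_mw

print(f"IT load:        {it_load_mw:.2f} MW")
print(f"Total facility: {total_facility_mw:.2f} MW")
print(f"Overhead:       {overhead_mw:.2f} MW for cooling, UPS losses and lighting")
```

On those assumed numbers, everything other than the IT equipment consumes around 0.9MW; a legacy facility running at a PUE of 2.0 would need a full extra 6MW to support the same IT load.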

The lower the opex through lower power consumption, the greener the data centre, and potentially the lower the cost of service to end users.

The first 3MW suite - there are four halls in the phase one suite of the site - is now open.

The site is connected to Rackspace's London metro fibre ring and to the main European DWDM network.

The plan is to start taking on load in the next couple of months, with the first hall full by September. In parallel, hall 2 in Suite 1 will be commissioned, and so on until Phase 1 of the campus reaches its 12MW capacity across the two suites. Then it is on to Phase 2.
