DCA Member's Blog
Please post data centre industry comments, experiences, ideas, questions, queries and goings-on here. To stay informed of updates, ensure you are subscribed by clicking the "subscribe" button.

 


Air-side free-cooling: direct or indirect systems and their impact on PUE

Posted By Robert Tozer, Operational Intelligence Limited, 01 October 2015

There is a general perception that direct air systems are more cost-efficient and hence the default option. However, provided the design incorporates good air segregation, the recommended ASHRAE equipment environmental conditions and adiabatic cooling of the outdoor air, indirect air systems are considerably more efficient than direct air systems for most cities warmer than London. This is because many more hours of free cooling can be achieved with adiabatic cooling without affecting the indoor conditions. Furthermore, this solution makes zero refrigeration possible in most of the world. In cooler climates, direct systems are only marginally more efficient.

Often when data centre free cooling is discussed, people assume it means direct fresh air cooling. However, in climates warmer than London, indirect air systems are more efficient than direct air systems, can allow refrigeration to be eliminated and considerably reduce the electrical plant sizing requirements. Using adiabatic (evaporative) cooling on the outdoor airstream achieves free cooling for many more hours of the year in hot, dry conditions. Further detail on the application of free cooling in data centres is available in our technical papers.
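To make the comparison concrete, here is a minimal sketch (ours, not from the technical papers) that counts free-cooling hours for each approach from hourly outdoor dry-bulb and wet-bulb temperatures. The supply setpoint and heat-exchanger approach values are illustrative assumptions.

    # Minimal sketch: count annual free-cooling hours for direct vs indirect
    # air-side economisers. Setpoint and approach values are assumptions.

    def free_cooling_hours(hourly_dry_bulb, hourly_wet_bulb,
                           supply_setpoint=24.0,  # assumed allowable supply temperature, degC
                           hx_approach=3.0):      # assumed indirect heat-exchanger approach, degC
        # Direct systems are limited by the outdoor dry-bulb temperature, since
        # adiabatically cooling the supply air would raise indoor humidity.
        direct = sum(1 for db in hourly_dry_bulb if db <= supply_setpoint)
        # Indirect systems can evaporatively cool the outdoor airstream towards
        # its wet-bulb temperature without affecting indoor conditions, paying
        # only the heat-exchanger approach.
        indirect = sum(1 for wb in hourly_wet_bulb
                       if wb + hx_approach <= supply_setpoint)
        return direct, indirect

In hot, dry climates the wet-bulb temperature sits far below the dry-bulb, so the indirect count comes out much higher: the point above in miniature.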

 

Tags:  air management  Cooling  Data Centre  Location  London  pue 


PUE Record busted!

Posted By John Booth, Carbon3IT, 01 April 2015

Reversed PUE Trend in Europe

 

Hot on the heels of the EURECA project, which is designed to improve the energy efficiency of data centres in the public sector, one European country has beaten them to it! The secret government data centre, which can’t be named due to security concerns, has published PUE results of 0.2, completely the reverse of the industry average of 2.0. Dilbert Watkins, the consultant who worked on the project, said: “This data centre is the most efficient yet; when we turn it on we will also be using waste heat by heating up our breakfasts on specially designed fittings that fit on the reverse of the racks.”

The DCA looks forward to finding out more about the facility in a future article.

 

 

Tags:  energy-efficient computing  eureca  pue 


A guide to data centre metrics and standards for start-ups and SMBs

Posted By Anne-Marie Lavelle, London Data Exchange, 27 March 2014
Updated: 27 March 2014


Having made the choice to co-locate your organisation’s servers and infrastructure with a trusted data centre provider, you need to be able to understand the key metrics and standards used to evaluate and benchmark each data centre operator. With so many terms to get to grips with, we felt it necessary to address the most prevalent ones for data centres.

The Green Grid has developed a series of metrics to encourage greater energy efficiency within the data centre. Here are the seven terms, beginning with three of those metrics, that we think you’ll find most useful.

PUE: The most common metric used to show how efficiently data centres use their energy is Power Usage Effectiveness. Essentially, it is the ratio of the total energy used to run the overall data centre to the energy consumed by the IT equipment alone. The total includes overheads such as UPS systems, cooling systems, chillers, HVAC for the computer room, air handlers and data centre lighting, on top of the energy drawn by the servers, storage and network switches that do the actual computing.

Ideally a data centre’s PUE would be 1.0, which would mean 100% of the energy is used by the computing devices in the data centre and none by lighting, cooling or power conversion. LDeX, for instance, runs below 1.35, meaning that for every watt of energy used by the servers, less than 0.35 of a watt is used for cooling, lighting and power conversion.
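As a worked example of the ratio (with made-up kWh figures, not LDeX’s actual meter readings):

    # PUE = total facility energy / IT equipment energy; 1.0 is the ideal.
    def pue(total_facility_kwh, it_equipment_kwh):
        return total_facility_kwh / it_equipment_kwh

    print(pue(1_350_000, 1_000_000))  # 1.35: 0.35 W of overhead per watt of IT load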

CUE: Carbon Usage Effectiveness, also developed by The Green Grid, complements PUE and looks at the carbon emissions associated with operating a data centre. It is calculated by dividing the total carbon emissions due to the energy consumption of the data centre by the energy consumption of its servers and IT equipment. The metric is expressed in kilograms of carbon dioxide equivalent (kgCO2eq) per kilowatt-hour (kWh), and a data centre powered entirely by clean energy will have a CUE of zero. It provides a great way of identifying improvements to a data centre’s sustainability and of tracking how operators improve designs and processes over time. LDeX is run on 100% renewable electricity from Scottish Power.

WUE: Water Usage Effectiveness calculates how efficiently a data centre uses water within its facility. WUE is the ratio of annual water usage to the energy consumed by the IT equipment and servers, expressed in litres per kilowatt-hour (L/kWh). Like CUE, the ideal value of WUE is zero: no water used to operate the data centre. LDeX does not operate chilled water cooling, meaning that we do not use water to run our data centre facility.
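Both metrics follow the same pattern as PUE; a minimal sketch with illustrative inputs:

    # CUE in kgCO2eq per kWh of IT energy; zero for 100% clean energy.
    def cue(total_co2_kg, it_equipment_kwh):
        return total_co2_kg / it_equipment_kwh

    # WUE in litres per kWh of IT energy; zero if no water is used.
    def wue(annual_water_litres, it_equipment_kwh):
        return annual_water_litres / it_equipment_kwh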

Power SLAs: A service level agreement sets out the compensation offered in the unlikely event that the power provided by the data centre operator to a client as part of an agreement is lost and the interruption affects your company’s business. The last thing your business wants is people being unable to access your company’s website, so if power to your rack is cut for some reason, make sure you have measures in place.

Data centres refer to the Uptime Institute for guidance on standards for downtime. The difference between 99.671%, 99.741%, 99.982% and 99.995% availability, while seemingly nominal, can be significant depending on the application. While no downtime is ideal, the tier system allows the durations below for services to be unavailable within one year (525,600 minutes); the short sketch after the list reproduces the arithmetic:

  • Tier 1 (99.671%) status would allow 1729.224 minutes
  • Tier 2 (99.741%) status would allow 1361.304 minutes
  • Tier 3 (99.982%) status would allow 94.608 minutes
  • Tier 4 (99.995%) status would allow 26.28 minutes
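A minimal sketch reproducing these figures from the availability percentages:

    MINUTES_PER_YEAR = 525_600  # minutes in a non-leap year

    def allowed_downtime_minutes(availability_percent):
        return MINUTES_PER_YEAR * (1 - availability_percent / 100)

    for tier, pct in [(1, 99.671), (2, 99.741), (3, 99.982), (4, 99.995)]:
        print(f"Tier {tier}: {allowed_downtime_minutes(pct):.3f} minutes")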

LDeX has infrastructure resilience rated at Tier 3 status, offering customers peace of mind that their business is protected in the unlikely event of an outage. We operate closed-control cooling systems in our facilities, enabling us to offer tight environmental parameter SLAs: a cold aisle temperature of 23°C +/- 3°C and relative humidity (RH) of 35% - 60%.

Some data centres run fresh air cooling systems, which make it hard to regulate RH; quite often their RH parameters are 20% - 80% or beyond. Increased humidity in the data hall has on occasion resulted in rust on server components, while low RH can produce static electricity. Make sure you look into this and ask about it.

Understand the ISO standards that matter to your business

ISO 50001 – Energy management

Using energy efficiently helps organisations save money as well as helping to conserve resources and tackle climate change. ISO 50001 supports organisations in all sectors to use energy more efficiently, through the development of an energy management system (EnMS).

ISO 50001:2011 provides a framework of requirements for organizations to:

  • Develop a policy for more efficient use of energy
  • Fix targets and objectives to meet the policy
  • Use data to better understand and make decisions about energy use
  • Measure the results
  • Review how well the policy works, and
  • Continually improve energy management

ISO 27001 – Information Security Management

Keeping your company’s intellectual property secure should be a top priority for your business, and ensuring that your data centre provider offers this sort of resilience is imperative. The ISO 27000 family of standards helps organizations keep information assets secure.

Using this will help your organization manage the security of assets such as financial information, intellectual property, employee details or information entrusted to you by third parties.

ISO/IEC 27001 is the best-known standard in the family providing requirements for an information security management system (ISMS). An ISMS is a systematic approach to managing sensitive company information so that it remains secure. It includes people, processes and IT systems by applying a risk management process.

It can help small, medium and large businesses in any sector keep information assets secure.

Like other ISO management system standards, certification to ISO/IEC 27001 is possible but not obligatory. Some organizations choose to implement the standard in order to benefit from the best practice it contains, while others also want to get certified to reassure customers and clients that the standard’s recommendations have been followed. ISO itself does not perform certification.

PCI DSS: Banks and businesses alike conduct a lot of transactions over the internet. With this in mind, the PCI Security Standards Council (SSC) developed a set of international security standards to ensure that service providers and merchants protect payment data, whether from a debit, credit or company purchasing card. As of 1st January 2015, PCI DSS 3.0 becomes mandatory, superseding version 2.0. The standard is broken down into 12 requirements, ranging from vulnerability assessments to encrypting data. Make sure to ask whether your data centre operator holds this standard.

With the increased stakeholder scrutiny placed on data centres, steps need to be taken to make sure that the data centre operator you choose is aligning its strategy not only with some of the metrics and standards mentioned here, but also with the other security, environmental and governmental regulations that have been brought in.

Working for a successful data centre and network services provider like LDeX has enabled me, as a relative newcomer to the data centre industry, to get to grips with these terms and to help clients understand where LDeX sits in comparison with our competitors.

Anne-Marie Lavelle, Group Marketing Executive at LDeX Group

Tags:  connectivity  Cooling  CUE  data centre  Datacentre  efficiency  ISO standards  operational best practice  PCI DSS  PUE  WUE 


What energy efficiency metrics to use?

Posted By Barry Paton, 05 December 2013

Dear all,

I'm trying to answer that question as part of my master's degree with the Open University.

The aim of the research project is to design a state-of-the-art monitoring dashboard that will help to improve energy efficiency in data centres.

I've created a survey to gather information on metrics from industry experts.

The survey can be accessed here: http://eSurv.org?s=OCHJMM_3e884f22

Please take a few minutes to share your knowledge and experience.

All participants will receive a copy of the final thesis on request. Be assured that no personal data will be reported in the results.

Thanks and Best Regards,

Barry

Tags:  efficiency  energy-efficient computing  PUE 


Own or Outsource, Build or Buy?

Posted By Steve Hone, 01 August 2012

These are questions I am often asked. I came across this article on the subject the other day and wanted to share it with you. It was posted originally by Nicholas Greene a year or so ago but is still very relevant today…

For one reason or another, your enterprise organization's got some big computing needs right over the horizon. Maybe you're setting up a new consumer payment or accounts management platform. Maybe you've just developed the next best online game and need servers to host it. Maybe you just need some additional storage. Whatever the reason, you're going to need a data centre. One question remains, though: should you outsource, or build?

Constructing a data centre is no mean task, as you well know: it's a positively herculean undertaking that brings with it overwhelming costs and an exhausting time commitment just to build, never mind maintain after the fact. If you're going to build your own facility, you'd better make damned sure your business can handle it. If it can't, you'll flounder; it's simply reality.

There are a lot of things you have to consider: cost and budget, employees, time constraints… you know the drill. Today, we're going to take a closer look at the first entry on the list, the reason for setting up a facility, and use it as a springboard for determining when you should outsource and when the management of a facility should be placed solely in your organization's hands.

Ultimately, you have three choices: outsource to a multi-purpose data vendor, construct your own purpose-built facility, or hire a contractor to custom-tailor a facility for you. Before we even get started, I'm going to say right out the door that most businesses are better off going with the first or third option.

To determine which choice is right for you, there are a few things you should consider. What does your business do? What will you be using the facility for, and how intensive will your needs be? How important are the tasks you require the facility for? Are they key components of your business strategy, or of just one arm of your corporation?

What your business does can play a considerable role in determining whether or not you'll run your own servers. Is your organization based solely in the technology sector, or is your primary area of expertise in finance? Are you a hardware or software vendor, or do you primarily sell consumer products? How large is your IT department? How well funded is it? All of these questions should be taken into account, as they can very well determine right out the door whether your business is even capable of managing its own facility without significant restructuring, let alone building one.

Of course, that's only the first thing you need to consider: what your organization does by no means restricts it from constructing its own centres. Facebook is a prime example of this. Of course, in their case, they have their own reasons for building their own servers: they are, after all, the world's largest and best-known social network.

As I've already stated, what you need the facility for also plays a very important role. If you are, for example, a cloud-based SaaS vendor, it should go entirely without saying that you should build and manage your own facility. As a general rule, if you expect to turn a significant profit from your facility, or the need met by the facility comprises a key aspect of your business model, you should look at running your own, or at the very least get yourself a custom-built data centre.

Bandwidth goes hand in hand with purpose. How many gigabytes of data is your product or service going to use? How will the costs of development and management stack up against the fees you'd be paying if you outsourced? Will you turn enough of a profit to merit setting up your own facility?

Do you foresee yourself needing to expand your facility in the future? How will you do it? Scalability is an important concern, and if your business can't afford to expand, or virtualise, outsourcing might be the answer. Size matters, folks, and the smaller your business, the more likely you are to need to contract out rather than run your own centre.

Finally, there's your staff. Is it more economical to train and hire a whole team of new employees, or simply contract out to an organization to manage things for you?

Every business is different, and not all organizations are built equal. What I've listed here is little more than a guideline. Ultimately, the choice of whether or not to outsource rests entirely with you.

Tags:  building  central  comms  cooling  Data Centre  efficiency  Location  planning  PUE 


Why Google’s Data Centre is Not Like Yours

Posted By Steve Hone, 01 August 2012

Does your company operate a data centre? It's likely you do, since any organisation that is large enough needs its own servers. Yet while you probably purchase servers from Dell or HP and put them in a room with a whole bunch of temperature control units, Google decided long ago to rethink the concept of the data centre.

They started this process with the servers themselves. They purchase all of their own components and build the servers from scratch. Why? Because the company feels that they can make a better server unit that fits their needs. Instead of a typical server, you get something that looks more like a homemade PC project.

There are tens of thousands of these custom-built servers located around the world. When you do a search or use any Google product, the company takes your IP address and routes you to its closest data centre in order to provide the highest speed (lowest latency) possible. The company has realised the correlation between speed and customer satisfaction and has therefore built enough data centres to accommodate demand.

The data centres are also configured differently. In order to optimise space and cooling, the company packs servers into shipping containers that are then individually cooled. Google's experts have determined that this is the most efficient way to economise.

Take a look at this tour of such a facility. http://youtu.be/zRwPSFpLX8I

So why are Google's data centres so special? The answer is efficiency. The company uses a huge amount of power to keep these servers running at an optimal temperature. It tries to locate facilities near hydroelectric power because of its lower cost, which also explains why Google has such an interest in renewable energy and last year entered into a twenty-year agreement to buy wind power. The company knows that its power needs will increase over time, and this is a way to hedge against fluctuations in energy prices.

Tags:  building  central  comms  cooling  data  Data Centre  efficiency  Location  PUE 


PUE - we data centre folk must rein it in..

Posted By Simon Campbell-Whyte, 12 July 2012

By the end of the Leeds Conference last month, PUE had become the P-word that no one dared mention... why?

Because over the 20-odd talks about data centres, tech, cooling, energy efficiency and so on, it became clear that nobody could quote a PUE number that was actually a genuine one.

So did you know that:

PUE is not just electricity?

- it should include a weighting factor for other energy forms such as gas, fuel oil and different types of water...

It has four categories of measurement?

- depending on how it is measured, this should be clarified as category 0, 1, 2 or 3

A PUE number should only refer to 12 months of operational data?

- PUE cannot be last night, yesterday, a minute ago, tomorrow or a load test last year.

It was clear from the conference that we need to put our house in order...
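As a sketch of what quoting a genuine number involves, the snippet below computes a PUE from 12 months of operational data, converting gas and fuel oil consumption to an electrical equivalent with weighting factors. The factor values are illustrative assumptions, not The Green Grid's official figures.

    # Assumed source-energy weighting factors (illustrative, not official).
    GAS_WEIGHT = 0.35
    FUEL_OIL_WEIGHT = 0.35

    def annual_pue(monthly_electricity_kwh, monthly_gas_kwh,
                   monthly_fuel_oil_kwh, monthly_it_kwh):
        # A genuine PUE covers a full 12 months, never a spot reading.
        assert all(len(m) == 12 for m in (monthly_electricity_kwh, monthly_gas_kwh,
                                          monthly_fuel_oil_kwh, monthly_it_kwh))
        total = (sum(monthly_electricity_kwh)
                 + GAS_WEIGHT * sum(monthly_gas_kwh)
                 + FUEL_OIL_WEIGHT * sum(monthly_fuel_oil_kwh))
        return total / sum(monthly_it_kwh)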

 

Tags:  PUE  rein it in 
