Trying to plan your infrastructure requirements in a widely fluctuating market is difficult enough, but data centre managers who use DCIM tools may be ahead of the game.
Today's data centre market is undergoing rapid change, which is why people like Ross Hammond are posing the question: how well do you really know your data centre?
As the director of telecom business at Emerson Network Power in Australia and New Zealand, Hammond has seen this change first hand.
"Gone are the days when steady market rises and falls were the norm; today's market is challenging the status quo," he says.
"Project cycles are getting shorter, and user expectations are changing as a result. The rapid uptake of mobile technologies and cloud means determining requirements for future capacity requires a fundamental shift in mindset.
"Throw in the need to lower power costs in a slowing economy while also lowering your emissions wherever possible, and the market isn't nearly the same as it was five or 10 years ago."
Hammond says it all points to the need for fluidity in infrastructure on a scale not seen before in Australia.
"For starters, don't make the mistake of putting a pause on your build plans," he says.
"I regularly speak to CIOs who have their finger on the pulse of how their business is growing, trying to determine which way to direct their investments in infrastructure and capacity.
"What we do know is that the market is not slowing down."
In fact, Hammond says it's compressing, but generally there is a positive view of the market in the short term.
"This is something we haven't seen since before the global financial crisis," he says.
"What we're seeing points very strongly to the need for fluidity in infrastructure.
"Yes, there will be rapid contractions just as often as escalations in demand, but companies can't just stop investing. If they do, they risk losing their competitive advantage."
So what should companies do? Hammond's first suggestion is to take a closer look at data centre infrastructure management (DCIM).
"DCIM tools these days can measure energy use at a rack-by-rack and even server-by-server level, which is crucial for departments with budgetary constraints that need to see where their money is being spent, and just as important for greening the data centre," Hammond says.
DCIM spans the entire ICT ecosystem in a way that not only optimises current infrastructure but also enables visibility into, and planning for, future requirements.
"It affords business executives the opportunity to be less concerned with the nuts and bolts of what they don't have and more concerned with what can be done with what they do have," Hammond says.
"Using DCIM tools also lets you lower emissions and reduce energy consumption, as you can see exactly what is going on in your data centre at that granular level, which allows you to modify how the data centre operates on a rack-by-rack or server-by-server basis."
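As an illustration of the granular visibility Hammond describes, the sketch below rolls hypothetical server-level power readings up to rack level and attaches an assumed electricity tariff. The rack names, wattages and tariff are invented for the example; they are not output from any real DCIM product.

```python
from collections import defaultdict

# Hypothetical per-server power readings in watts, of the kind a DCIM
# tool might expose; rack and server names are illustrative only.
readings = [
    ("rack-A", "srv-01", 310.0),
    ("rack-A", "srv-02", 145.0),
    ("rack-B", "srv-03", 480.0),
]

def rack_totals(samples):
    """Roll server-level wattage up to rack level."""
    totals = defaultdict(float)
    for rack, _server, watts in samples:
        totals[rack] += watts
    return dict(totals)

TARIFF_PER_KWH = 0.25  # assumed tariff, dollars per kWh

for rack, watts in sorted(rack_totals(readings).items()):
    cost = watts / 1000 * TARIFF_PER_KWH
    print(f"{rack}: {watts:.0f} W, about ${cost:.3f} per hour")
```

The same per-rack view makes it easy to see which racks dominate the power bill before deciding where to consolidate or rebalance load.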
For those building a new data centre, Hammond says another trend to consider is the move towards the modular data centre.
This is where the data centre is built off-site, delivered on the back of a truck and scaled up as demand grows.
IT analysis firm Gartner estimates the DCIM market will reach $1.7 billion by 2016, a huge leap considering the market is currently worth $450 million.
The market is considered a late bloomer because data centre managers didn't buy into the first-generation infrastructure tools, which were considered too limited in scope.
That has changed with the introduction of new tools that can deliver hundreds of thousands of dollars in energy and operational cost savings simply by introducing small process changes.
By correlating power, cooling and space resources to individual servers, DCIM tools today can proactively identify infrastructure problems and how they might affect specific IT loads.
Gartner reports that DCIM tools can reduce operating expenditure by as much as 20 per cent and extend the life of a data centre by as much as five years by making the most of existing power infrastructure.
Gartner says DCIM has the unique ability to identify stranded power, helping managers avoid additional equipment purchases.
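"Stranded" power is capacity provisioned to a rack that the installed equipment never actually draws, so it can't safely be allocated elsewhere. A minimal sketch of how a tool might flag it, assuming invented provisioning figures and a 20 per cent safety headroom above observed peak draw:

```python
# Illustrative only: provisioning figures, rack names and the 20%
# headroom policy are assumptions, not real DCIM data.
racks = {
    "rack-A": {"provisioned_kw": 10.0, "peak_draw_kw": 4.2},
    "rack-B": {"provisioned_kw": 10.0, "peak_draw_kw": 8.9},
}

HEADROOM = 1.2  # keep a 20% safety margin above the observed peak

def stranded_kw(provisioned_kw, peak_draw_kw, headroom=HEADROOM):
    """Capacity left over after reserving headroom above peak draw."""
    return max(0.0, provisioned_kw - peak_draw_kw * headroom)

for name, r in racks.items():
    s = stranded_kw(r["provisioned_kw"], r["peak_draw_kw"])
    print(f"{name}: {s:.2f} kW stranded")
```

Reclaiming that stranded capacity is what lets managers defer the additional equipment purchases Gartner mentions.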
These tools can also be used to reduce cooling costs.
The general trend today is to run computer equipment and data centres at higher temperatures, based on new thermal guidelines developed by ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers).
But to maintain a margin of safety, companies must monitor temperatures in real time. Another factor is visibility: operators must keep tabs on a multitude of infrastructure components, including power, cooling, HVAC and other facilities-related equipment.
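A real-time check of that kind can be sketched as follows, using ASHRAE's recommended server inlet envelope of 18 to 27 degrees Celsius. The sensor names, readings and the two-degree local safety margin are assumptions for illustration.

```python
# ASHRAE recommended inlet-temperature envelope (degrees Celsius).
ASHRAE_LOW_C, ASHRAE_HIGH_C = 18.0, 27.0
MARGIN_C = 2.0  # assumed local safety margin below the upper limit

def status(temp_c, low=ASHRAE_LOW_C, high=ASHRAE_HIGH_C, margin=MARGIN_C):
    """Classify an inlet reading against the envelope with margin."""
    if temp_c > high - margin:
        return "warn-high"
    if temp_c < low:
        return "warn-low"
    return "ok"

# Illustrative sensor readings, not real telemetry.
readings_c = {"inlet-A1": 21.5, "inlet-A2": 26.1, "inlet-B1": 24.0}

for sensor, temp in readings_c.items():
    print(f"{sensor}: {temp:.1f} C -> {status(temp)}")
```

Flagging readings before they reach the ASHRAE limit, rather than at it, is what preserves the safety margin while still capturing the cooling savings of running hotter.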