For anyone still in the dark about the complexity of today's data centers, the Symantec State of the Data Center Report
shines a bright light on the subject. The report, based on surveys and interviews with more than 800 data center managers in Global 2000 organizations late in 2007, shows that data centers are becoming more complex and are likely to become even more so in the next five years:
"Increasingly, businesses are demanding higher application availability and the rapid integration of new technologies. At the same time, the amount of data generated by data center applications is exploding and much of it must be protected, as per new privacy and government regulations, and retained for longer periods of time."
This article explores the primary challenges that data center managers face today, based on the report's findings. It also suggests what these managers can do to better manage complexity and contain growing costs.
The organizations represented in the report had an average of 31,250 employees, and 90% of the firms had between 7,000 and 120,000 employees. The typical number of data centers in these organizations was 14 or 15. The typical annual IT budget was $54 million.
According to the report, today's data center managers face three pervasive – and related – challenges:
- stringent (if not unreasonable) service-level agreements
- ongoing data center growth
- staffing issues
"The study found that roughly two-thirds of data center managers said their data centers are becoming too complex to manage easily. If dealing with this complexity was not enough, more than half of the data center managers surveyed said internal service-level-agreement (SLA) demands are increasing. Simply throwing bodies at the problem is not the solution. The majority of managers said it is getting harder to find qualified staff. And more than half of the managers said their data centers are understaffed."
Adding to the complexity: mission-critical applications are running on a variety of platforms with varying operating systems. For example, the survey found that organizations are using, on average, 964 physical servers to meet their users' demands.
In addition, many managers pointed to the growing pressure to keep systems and applications available 24x7.
Driving this demand is the global nature of the Internet and the international "reach" of many companies. Making matters more challenging, this expectation of high availability now extends even to non-essential applications.
At the same time, managers report that their budgets continue to be relatively flat.
As a result of all these pressures, more and more managers say they are turning to cost-containment technologies, such as server consolidation, virtualization, storage resource management, and data lifecycle management. Others say they are outsourcing routine tasks.
Of particular concern to these managers are issues pertaining to how they manage the massive amounts of data being generated by their organizations. Many managers noted that their storage capacity has been doubling every year with no end in sight. They also noted that the growth of data associated with core business applications is having a significant impact — not just on their backup strategies, but on their approach to disaster recovery as well.
Complicating matters further is the additional consideration that must be given to e-discovery, data privacy, and data retention regulations.
With regard to e-discovery, recent amendments to the Federal Rules of Civil Procedure now impose more stringent rules regarding which electronically stored information must be preserved and presented in litigation in U.S. federal courts. Many lower-level courts now use the same guidelines.
The implication for the data center is that organizations need a way to quickly retrieve such information if it is subpoenaed during litigation. Additionally, the cost to produce subpoenaed information can be staggering.
Meanwhile, data privacy laws are causing a different type of problem. Some managers noted that concerns about identity theft and new state, federal, and industry data protection and privacy regulations are making data handling more complicated.
Data retention regulations have been causing changes as well. Organizations must identify information subject to data retention regulations and somehow incorporate retention and expiry policies throughout their storage, backup, and data lifecycle management strategies.
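The retention-and-expiry logic described above can be sketched as a simple policy check. This is an illustrative example only: the data classes, retention periods, and function names here are hypothetical, and real retention schedules come from the specific regulations governing each record type.

```python
from datetime import date, timedelta

# Hypothetical retention periods (in days) per data class; actual values
# must come from the regulations that apply to each record type.
RETENTION_POLICY = {
    "financial": 7 * 365,  # e.g., a seven-year retention requirement
    "email": 3 * 365,
    "log": 90,
}

def is_expired(data_class, created, today=None):
    """Return True if a record's retention period has elapsed."""
    today = today or date.today()
    retention_days = RETENTION_POLICY.get(data_class)
    if retention_days is None:
        # Unknown class: retain by default rather than risk deleting
        # something a regulation requires us to keep.
        return False
    return created + timedelta(days=retention_days) < today

# A log record created 100 days ago is past its 90-day window;
# a financial record of the same age is not.
created = date.today() - timedelta(days=100)
print(is_expired("log", created))        # True
print(is_expired("financial", created))  # False
```

In practice this kind of check would be embedded in the backup and data lifecycle management tooling, so that expiry decisions are applied consistently across primary storage, backups, and archives rather than in each system separately.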
So how do data center managers view the cost-containment technologies they've embraced to address these various issues?
In general, they realize that approaches such as server consolidation, virtualization, and storage resource management can only do so much. They also report dissatisfaction with current solutions: many managers say they are deliberately using point solutions that do a great job in one particular area at the expense of others.
"For example, many rely on the software provided by their storage device manufacturer for data management," the report observes. "The problem is that such software does not support any other equipment in the data center, not even storage devices from other manufacturers."
What is lacking, according to the managers, is a management solution that gives a complete view of the data center.
What do data center managers see when asked about the future? One thing in particular: more complexity.
Primarily, that's because most companies are going global: a single system will serve multiple locations, which will only increase the load on data centers. As all operations move toward 24x7 availability, more robust recovery and availability solutions must be deployed to ensure round-the-clock access to applications.
"Companies will be taking different approaches to cope with the continued complexity of their data centers. For instance, some are looking to standards to help. To that end, several focus group attendees and interviewees mentioned adopting ITIL methodologies. Additionally, some noted that in the next five years they would be going for certifications such as TIA-942 and ISO 20000 to help with their overall data center operations. Along those lines, some said they will look for staff with Data Center Foundation Certification and Certified Data Center Professional certifications."
Some respondents also pointed to the need for software standardization to master data center complexity and better utilize current resources. Standardizing on a single layer of infrastructure software that supports all major storage and server hardware platforms would protect information and applications, enhance data center service levels, improve storage and server utilization, and drive down operational cost, they said.
For most of the managers surveyed by the Symantec State of the Data Center Report, complexity and service level demands will continue to increase. As one manager put it: "There will not be any improvement; these challenges will get even more complex."
With some two-thirds of data center managers saying their data centers are becoming too complex to manage easily, and more than half saying internal SLA demands are increasing, today's data centers are clearly under the gun. Indeed, many data centers are at a critical turning point.
What is needed to reduce data center complexity is a standardized approach. Symantec recommends that companies standardize on a single layer of infrastructure software that supports all major applications, databases, processors, and storage and server hardware platforms. Doing so will better equip them to protect their information and applications, enhance data center service levels, improve storage and server utilization, consistently manage physical and virtual environments, and drive down operational cost.