There’s little question that virtualization is the single biggest game-changing trend in IT today. But virtualization can make information management more complex, particularly as it relates to storage. It makes both storage management and storage-intensive operations such as backup and recovery more difficult, driving storage utilization rates even lower. Given that most companies have historically had storage utilization rates of only 40% to 50%, addressing this problem is more relevant than ever.
Without a doubt, virtualization is the biggest game-changing trend in IT today. As Gartner Inc. put it recently: “Virtualization is the highest-impact issue changing infrastructure and operations through 2012. It will change how you manage, how and what you buy, how you deploy, how you plan, and how you charge.” [1]
But as more and more IT departments are finding out, virtualization can make information management more complex. Again Gartner: “Server virtualization solutions and projects often [reduce] storage visibility, management, utilization and subsequently increase storage costs.” [2]
This article describes how IT departments can realize the promise of virtualization by applying a service management framework to their storage infrastructure and utilizing new backup technologies that not only reduce storage and network consumption but simplify operations. This two-pronged approach will enable them to improve storage visibility and utilization and eliminate bottlenecks from backup and recovery operations.
Most organizations have been moving toward more manageable storage infrastructures by eliminating storage “silos” and leveraging technologies such as networked storage, storage virtualization, and standardized tools and processes for managing heterogeneous storage platforms. Server virtualization now introduces new storage management challenges.
With virtualization, storage is delivered as if it originated from a single device, even though several devices are actually being used. This represents a fundamental paradigm shift.
The storage management team now requires deeper visibility of the entire path from host to virtual storage, as well as the capability to map the virtual storage to the physical storage devices. For administrators, the manual tracking of physical to logical resource allocations is a daunting task.
In short, storage virtualization adds another layer of abstraction that is beneficial for storage management but presents greater complexity in managing the relationships between servers and storage.
What’s needed is a view into the data path that provides a full understanding of how applications are spread across the physical and virtual disk resources for direct-attached storage, network-attached storage, and SAN-attached storage. This enables storage administrators to identify potential bottlenecks or failure points in a data path, regardless of complexity.
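The kind of end-to-end mapping described above can be pictured as a lookup from application to virtual volume to physical device. The Python sketch below is purely illustrative, with hypothetical application, LUN, and array names; a storage resource management tool would discover these relationships automatically rather than hard-code them.

```python
# Illustrative mapping of the data path: application -> virtual volume ->
# physical device. All names (applications, LUNs, arrays) are hypothetical.

virtual_to_physical = {
    "vlun-01": ["array-A:disk-3", "array-B:disk-7"],   # striped across two arrays
    "vlun-02": ["array-A:disk-4"],
}

app_to_virtual = {
    "payroll-db": ["vlun-01", "vlun-02"],
}

def physical_devices(app: str) -> set:
    """Resolve every physical device on an application's data path."""
    devices = set()
    for vlun in app_to_virtual.get(app, []):
        devices.update(virtual_to_physical.get(vlun, []))
    return devices

print(sorted(physical_devices("payroll-db")))
# Devices shared by many applications are candidate bottlenecks or
# single points of failure.
```

Once the full path is resolved this way, a device that appears in many applications’ paths stands out immediately as a potential bottleneck or failure point.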
According to a recent survey by Applied Research, more than half of all organizations expected to spend more on storage this year than they did in 2008. But at the same time, data from TheInfoPro indicates that storage utilization hovers at just 35%. Given the current economic climate, this state of affairs can’t be allowed to continue.
Moreover, if there’s one absolute in business, it’s that data will continue to grow. That’s why organizations need to optimize their data centers by more effectively utilizing their storage assets.
As a result, visibility of resource consumption in the data center is essential. Before analysis of the storage environment can be accomplished, IT professionals must be able to accurately track the availability and usage of resources. This information is critical not only in determining how much storage is currently available, but also in providing insight into any additional storage requirements.
By leveraging tools that provide end-to-end visibility of the virtual and physical environments, IT departments can reclaim underused storage and increase storage utilization.
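As a rough illustration of how such visibility translates into reclaimed capacity, the sketch below computes utilization across a hypothetical set of arrays and flags those below a reclaim threshold. All names and figures are invented for the example.

```python
# Hypothetical storage inventory (capacities in TB); in practice these
# figures would come from an SRM tool's discovery data.
arrays = {
    "array-A": {"capacity": 100, "used": 30},
    "array-B": {"capacity": 50,  "used": 45},
    "array-C": {"capacity": 80,  "used": 20},
}

def utilization(array: dict) -> float:
    """Fraction of an array's capacity that is actually consumed."""
    return array["used"] / array["capacity"]

# Overall utilization across the whole estate: 95 of 230 TB in use
overall = sum(a["used"] for a in arrays.values()) / \
          sum(a["capacity"] for a in arrays.values())

# Arrays below 40% utilization are candidates for reclaiming capacity
reclaim_candidates = [name for name, a in arrays.items()
                      if utilization(a) < 0.40]

print(f"overall utilization: {overall:.0%}")
print("reclaim candidates:", reclaim_candidates)
```

Even this toy inventory shows the pattern the survey data suggests: overall utilization sits near 40% while individual arrays hold large pools of idle capacity that could be reclaimed.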
Standardizing a key infrastructure process such as storage management not only helps control and reduce costs, but also helps improve service levels. As this key process is standardized, organizations can realize cost savings from virtualization as well as from the standardization process itself.
Standardized processes also reduce the time required for application deployment and change management. Presenting the necessary information required for hardware management in the context of the change process reduces both time and the possibility of errors. Management becomes a standardized process as opposed to a specialized task.
The benefits of standardizing a mixed physical and virtual environment include the ability to:
- Define policy per application tier
- View defined policies
- Detect configuration ‘drift’
- Benchmark against best practices
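Configuration “drift” detection, the third item above, amounts to comparing each host’s current settings against the policy defined for its application tier. A minimal sketch, using hypothetical policy attributes and host names:

```python
# Baseline policies per application tier, and current per-host settings.
# All attribute names and values are hypothetical.
baseline = {
    "tier-1": {"raid": "RAID-10", "replication": "sync",  "snapshots": "hourly"},
    "tier-2": {"raid": "RAID-5",  "replication": "async", "snapshots": "daily"},
}

current = {
    "host-42": {"tier": "tier-1", "raid": "RAID-5",
                "replication": "sync", "snapshots": "hourly"},
    "host-43": {"tier": "tier-2", "raid": "RAID-5",
                "replication": "async", "snapshots": "daily"},
}

def detect_drift(current: dict, baseline: dict) -> list:
    """Return (host, attribute, actual, expected) for every policy violation."""
    drift = []
    for host, cfg in current.items():
        policy = baseline[cfg["tier"]]
        for attr, expected in policy.items():
            if cfg.get(attr) != expected:
                drift.append((host, attr, cfg.get(attr), expected))
    return drift

print(detect_drift(current, baseline))
# host-42 is configured with RAID-5 but its tier mandates RAID-10
```

The value of standardization is that the baseline exists at all: without a defined policy per tier, there is nothing to diff the environment against.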
Symantec strongly believes in aligning storage operations with the business by enabling global, customized enterprise reports of data center resources across multiple geographic locations. A clear analysis of consumed and unconsumed storage resources mapped to the business enables organizations to implement prudent capacity planning practices. Aligning storage operations with the business also means organizations have the ability to implement service levels and to charge back for storage services rendered to the business.
Veritas CommandCentral from Symantec is a standards-based software solution that integrates storage capacity management, storage resource management, performance and policy management, and business reporting to ensure that the storage infrastructure is utilized properly and runs efficiently.
CommandCentral provides centralized visibility and control across physical and virtual heterogeneous storage environments, allowing IT organizations to make better use of existing storage resources.
Ultimately, CommandCentral enables organizations to deliver storage as a service and ensure that their infrastructure investments are aligned with business needs while leading to improvements in productivity and profitability.
Understanding how your team protects information in virtual machines for local and disaster recovery purposes can help you quickly identify sources of wasted storage and infrastructure resources. The right platform should do more than automate virtual machine data protection; it should provide a consistent backup and recovery approach for both physical and virtual servers, integrate with the latest hypervisor APIs, and offer data deduplication options.
Depending on which data protection platform you use, you may be forced to choose between two approaches for virtual machine backup and recovery. Neither approach is ideal, and each has specific performance and storage implications.
You can perform a traditional backup with an agent in the guest OS (the virtual machine), or you can back up the entire virtual machine as a single file and avoid placing agents in each guest. Unfortunately, agent-based backups degrade the performance of other virtual machines on the same physical host and do not offer quick system recovery. By contrast, backing up the entire virtual machine has less impact, but it produces large amounts of duplicated data and provides no visibility into what was backed up, which slows any recovery request. To work around this, some backup products, and some companies, perform both types of backups, wasting even more storage. Finally, remember that wasted backup storage is usually amplified (or doubled), because most companies keep two copies of backup data to meet local and disaster recovery needs.
Symantec NetBackup solves both the storage and data recovery problems associated with traditional virtual machine backup approaches. It uses patent-pending technology that allows a single-pass, image-based backup to also deliver granular file recovery. Companies already performing both client and virtual machine image backups can expect up to a 50% reduction in storage. And for companies using only one backup approach, this technology still offers better recovery features and more efficient storage usage. But this is only one part of how Symantec protects virtual server information.
Customers can also use integrated deduplication in NetBackup to reap greater infrastructure and storage benefits.
At its essence, deduplication is about shrinking the “footprint” required by your data. It can be done at the “source” before the data ever leaves a server or at the “target” where backup data is stored. Although the technology has existed for some time now, most organizations have yet to take advantage of the operational and storage efficiencies to be gained through deduplication.
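The source-versus-target distinction aside, the core mechanism of deduplication is the same: split data into chunks, hash each chunk, and store only one copy per unique hash. The sketch below uses fixed-size chunks for simplicity; production systems typically use variable-size, content-defined chunking to survive insertions that shift data.

```python
import hashlib

def dedup_store(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks; keep one copy per unique hash.

    Returns the chunk store and the ordered 'recipe' of hashes needed to
    reconstruct the original stream.
    """
    store = {}    # hash -> chunk bytes, stored exactly once
    recipe = []   # ordered hashes describing the original data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return store, recipe

# Simulated backup stream with repeated content
data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096
store, recipe = dedup_store(data)
print(len(recipe), "logical chunks,", len(store), "stored")
# 4 logical chunks, 2 stored
```

Running this at the “source” means only new hashes ever cross the network; running it at the “target” means the backup server stores each unique chunk once no matter how many clients send it.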
Symantec’s deduplication strategy encourages organizations to apply deduplication everywhere by following a three-step approach:
- Reduce data everywhere by moving deduplication technology closer to information sources
- Reduce deduplication complexity by providing centralized management for all forms of deduplication, across both Symantec and partner technologies
- Reduce infrastructure by using deduplication to improve the return on server virtualization through storage consolidation, efficient virtual server protection, and simplified management
As you consider whether you could reap both infrastructure and storage benefits from data deduplication, remember that these benefits extend beyond the data center to improve remote office and disaster recovery sites as well.
Without a doubt, virtualization is the biggest game-changing trend in IT today. But as more and more IT departments are finding out, virtualization can make information management more complex. What’s more, poorly utilized or managed storage and inefficient data protection processes can prevent IT organizations from fully realizing the potential of virtualization.
By applying a service management framework to the storage infrastructure, Veritas CommandCentral from Symantec enables organizations to gain end-to-end visibility into their storage environment, accurately track the availability and usage of existing resources, automatically monitor changes across all elements of the data path, and attain business-level reporting.
Symantec NetBackup, meanwhile, offers advanced data protection for virtual machines, including advanced integration with VMware and Microsoft Hyper-V and both source and target deduplication. The integration points eliminate scripts, simplify administration, and improve recovery speed. More importantly, this solution can minimize the impact of backup operations on virtual environments and dramatically reduce storage and network resource consumption without the addition of new tools.
The bottom line: Symantec enables IT organizations to improve storage visibility and utilization and eliminate the backup and recovery bottlenecks associated with virtual environments.
1. “Virtualization Changes Virtually Everything,” Gartner, March 2008
2. “Gartner Magic Quadrant for Storage Resource Management and SAN Management Software,” Gartner, June 2009