Has any topic attracted more attention than virtualization in IT departments these days? It’s not likely. Research bellwether Gartner Inc. has dubbed virtualization “the highest-impact issue changing infrastructure and operations through 2012.” ¹
With that attention, however, come new challenges, as IT environments become increasingly both physical and virtual. This Tech Brief looks at five key challenges you need to be aware of as you deploy – or continue to deploy – virtualization.
Challenge #1: Are you managing your physical and virtual environments with separate tools?
Management software and processes that work in the physical server environment don’t always work in a virtual environment. That can lead to complexity and inefficiency, such as higher costs for training, software, and operations. According to Symantec’s most recent IT Disaster Recovery Research Report, 35% of respondents cited “too many different tools” as the biggest challenge in protecting mission-critical data and applications in physical and virtual environments. Managing both environments on one platform with a single set of tools reduces that tool sprawl.
Challenge #2: Do you have end-to-end visibility into your storage?
Gartner is on the record as stating that “server virtualization solutions and projects often reduced storage visibility, management, utilization and subsequently increased storage costs.” ² The fact is, storage management is more challenging in virtual environments. Make sure your storage management strategy spans both environments and provides end-to-end visibility, monitoring, analysis, and active testing.
Challenge #3: Is your data protection strategy keeping pace with virtual server sprawl?
As effective as virtualization is for maximizing server utilization, it can create problems for data protection. The proliferation of virtual servers can make backup configuration more time-consuming, increase storage requirements, and complicate backups and restores. Support for advanced technologies such as off-host backup and block-level incremental backup becomes critical to overcoming the performance and bandwidth constraints of virtual environments. Data deduplication can reduce the storage required for backups and disaster recovery.
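To illustrate why deduplication shrinks backup storage, the sketch below splits a byte stream into fixed-size blocks and stores each unique block only once, keyed by a SHA-256 fingerprint. This is a simplified, hypothetical model – production deduplication engines typically use variable-size chunking and persistent indexes – but the principle is the same: repeated backups of largely unchanged data add references, not copies.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split a byte stream into fixed-size blocks and store each unique
    block once, keyed by its SHA-256 hash. Returns the block store and
    the ordered list of hashes needed to reconstruct the stream."""
    store = {}   # hash -> block contents (unique blocks only)
    recipe = []  # ordered hashes to rebuild the original data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        recipe.append(digest)
    return store, recipe

# A "backup" whose contents are mostly repeated blocks
backup = b"A" * 8192 + b"B" * 4096
store, recipe = dedupe_blocks(backup)
print(len(recipe), "blocks referenced,", len(store), "stored")  # 3 blocks referenced, 2 stored
```

Restoring the stream is just a matter of walking `recipe` and concatenating the stored blocks in order.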
Challenge #4: When applications running in your virtual machines fail, are you alerted?
You don’t hear much about it, but virtualization can decrease application visibility and recoverability. That’s because native server virtualization high availability (HA) tools usually lack the ability to monitor the health of the applications running inside virtual machines, so if an application fails, no action is taken to remediate the problem. Native disaster recovery (DR) features also don’t completely automate recovery at the DR site, and they don’t make failback to the production site easy. The result can be longer downtime, which is unacceptable for mission-critical applications. Make sure your HA/DR solution can detect failures and automate failover of applications on both virtual and physical servers.
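Native hypervisor HA typically reacts only when an entire virtual machine stops responding; application-aware solutions add an in-guest probe-and-remediate loop on top. A minimal sketch of that pattern is below – the probe and remediation actions are hypothetical placeholders, not any vendor’s API:

```python
def monitor(probe, remediate, max_failures=3, checks=10):
    """Run `checks` health probes in sequence. After `max_failures`
    consecutive failed probes, call `remediate` (e.g. restart the
    application or trigger failover) and reset the failure counter.
    Returns the number of remediations triggered."""
    failures = remediations = 0
    for _ in range(checks):
        if probe():                 # probe() returns True if the app is healthy
            failures = 0
        else:
            failures += 1
            if failures >= max_failures:
                remediate()
                remediations += 1
                failures = 0
    return remediations

# Simulated probe results: one healthy check, then three failures in a row
results = iter([True, False, False, False, True])
triggered = monitor(lambda: next(results), lambda: print("failing over"), checks=5)
print(triggered)  # 1
```

The key design point is the consecutive-failure threshold: it keeps a single transient glitch from triggering a disruptive failover, while still catching a genuinely hung application that the hypervisor sees as a healthy, running VM.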
Challenge #5: Do you know what needs to be considered when moving from physical to virtual endpoints?
Multiple configurations and computing models are the norm in today’s enterprise. Desktops and laptops, rich clients and thin clients, physical desktops and virtual desktops, shared systems and dedicated systems all have their place. To be successful, endpoint virtualization, the next big wave after server virtualization, must be approached as part of an overall strategy to decrease PC total cost of ownership and increase end-user productivity.
Virtualization can provide a host of benefits – if you do the proper up-front planning. And that means making sure your virtualization strategy doesn’t stint on storage management, data protection, High Availability/Disaster Recovery, and endpoint virtualization.
- ¹ “How to Reduce Your PC TCO 30% in 2011,” Troni, Gammage, Silver, March 20, 2009
- ² “Gartner Magic Quadrant for Storage Resource Management and SAN Management Software,” Filks, Passmore, June 22, 2009