
Ten Backup Mistakes in a Virtual Environment - Part 4

Created: 08 Sep 2010 • Updated: 28 May 2014 • 3 comments

On to two more backup mistakes in virtual environments that are all-too-common among IT professionals.  If the first five mistakes listed earlier (Part 1, Part 2, Part 3) weren't enough to convince you that backing up virtual environments is a challenge, here are two more. 
 
Mistake #6: Backing Up to Tape-only (or Disk-only)
Some IT professionals still take a singular approach and use only disk or only tape.  However, most analysts recommend a "balanced" strategy using both disk and tape for backups.  Instead, formulate a disk-and-tape strategy based on stakeholders' data management objectives and policies.  A few considerations:

  • Recovery Time Objectives (RTO)/Recovery Point Objectives (RPO).  Disk is typically faster for recovery.
  • Archiving requirements.  Tape is the most common archive medium.
  • Service Level Agreements (SLAs).  What is IT obligated to deliver to customers?
  • Existing hardware investments, as well as budget.
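The considerations above can be sketched as a simple policy check. This is an illustrative sketch only: the function name, thresholds, and inputs are assumptions for demonstration, not Backup Exec settings.

```python
# Hypothetical policy helper: pick backup media for a "balanced"
# disk + tape strategy. The 4-hour RTO threshold is an illustrative
# assumption, not a recommendation from the article.

def choose_backup_target(rto_hours: float, archive_required: bool) -> str:
    """Return the media mix suggested by RTO and archiving needs."""
    targets = []
    if rto_hours <= 4:            # tight RTO: disk restores are faster
        targets.append("disk")
    if archive_required:          # tape remains the common archive medium
        targets.append("tape")
    return " + ".join(targets) or "disk"  # default to disk

print(choose_backup_target(2, True))   # tight RTO plus archiving -> both media
```

In practice these inputs come from SLAs and data-management policy, as the list above notes, rather than hard-coded thresholds.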

 
The sensible strategy is to use disk where performance and flexibility are needed and use tape to reduce some costs.  In most cases, IT will be moving to a disk-based infrastructure.  However, the reality today is that tape is still widely used and should be accounted for in a virtual machine backup strategy. 
 
Mistake #7: Backing Up Redundant Data
There is a lot of duplicate data on virtual machines.  Consider the duplicate data in the OS, particularly if you deploy from a standard image.  It is simply not a wise strategy to back up all of that duplicate data.  It congests the network, lengthens the backup window, and raises storage hardware costs.  It is also completely avoidable.  IT should consider the following strategies:

  • Leverage VMware's vSphere block-level differential/incremental backup and restore.
  • Implement block-level data deduplication during backup of virtual machines to identify and exclude redundant data.

 
The amount of duplicate data in virtual environments is significantly higher than in physical environments - duplicate OS images, cloned machines, test machines, etc. all increase the number of similar blocks in the infrastructure.  Because virtual machines share so much identical data, deduplication can speed up backups and deliver major space and time savings.
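To make the block-level idea concrete, here is a minimal sketch of fixed-block deduplication: each block is hashed, and a block already seen is stored only once. The block size, names, and structure are illustrative assumptions; real backup products use far more sophisticated (often variable-block) schemes.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed block size


def dedupe_blocks(data: bytes):
    """Split data into fixed-size blocks, keep each unique block once,
    and record an ordered 'recipe' of hashes to rebuild the stream."""
    store = {}    # hash -> block contents (unique blocks only)
    recipe = []   # ordered hashes describing the original stream
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # redundant blocks are skipped
        recipe.append(digest)
    return store, recipe


# Three identical OS-image blocks plus one unique block: only two
# unique blocks need to travel over the network or land on storage.
data = b"A" * BLOCK_SIZE * 3 + b"B" * BLOCK_SIZE
store, recipe = dedupe_blocks(data)
print(len(recipe), len(store))  # 4 logical blocks, 2 stored
```

Restoring is the reverse: walk the recipe and concatenate the stored blocks, which is why dedupe is lossless.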
 
In my next post, we will cover two common mistakes that unnecessarily add complexity to your life: treating backup as an island and failing to take full advantage of your SAN.

Comments (3)

GeoffRose

OK, maybe I'm jumping the gun here since we are only up to point 7, but I hope that LAN-free backups will be covered, since they seem a little scary to set up yet can have a great speed benefit. Also, there isn't much info around about them, so more hints are always welcome!

http://communities.vmware.com/thread/256861?tstart=0
https://www-secure.symantec.com/connect/forums/vm-lan-free-backup

Cheers
Geoff.

Dimitris Peppas

VMware LAN-free backups are not much of a big deal to set up.

1. Disable automount on the backup server.
2. Configure the storage to give the backup server access to the VMFS LUNs.
3. Configure the login parameters for vCenter.

The same steps apply to both NetBackup and Backup Exec.

The rest is just licensing to make it work.

That's it! Setup the policy and you are ready.
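For step 1 above, the automount change on a Windows backup server is typically done with DISKPART. This is a transcript-style sketch of standard DISKPART commands (run from an elevated prompt), not Backup Exec- or NetBackup-specific configuration:

```shell
# Keep Windows from auto-mounting the shared VMFS LUNs presented to the
# backup server (auto-mounting them could corrupt the datastores).
diskpart
# DISKPART> automount disable   -- stop automatic drive-letter assignment
# DISKPART> automount scrub     -- remove entries for previously mounted volumes
# DISKPART> exit
```

Steps 2 and 3 (LUN masking/zoning and vCenter credentials) are specific to your storage array and backup product, so consult their documentation.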

One-pass backups give you a full image plus file-level restore.

BR,
DP
deepak.vasudevan

Dave, 

I would like to share from my personal experience two other points with respect to these backup strategies:

  1. Missing 6 (a): Storing the backup in the same environment.
  2. Missing 6 (b): No protection for the backup media, which ends up stored in a pile with other CDs and related media.

In one of my previous organizations, a secured backup medium went missing and we had to go through a lot of legal clearances over it.  Fortunately, the content involved was only about 1 per cent of the total data. 

Next, there should be criteria for when old backups are purged. Backups should follow a naming convention so that their obsolescence or continued relevance can be clearly determined.

 

I would appreciate it if you could expand your article series into these areas as well.
