
Backup Exec 2014 - differential backup of deduplicated, DFS-replicated share

Created: 11 Jun 2014 | 3 comments

We have a fileserver running Windows Server 2012. The data volumes are deduplicated, and replicated to a standby server with DFS-R. (This combination is supported by Microsoft.)

A full backup works fine, though it is obviously an "unoptimized" backup, that is, Backup Exec rehydrates (undeduplicates) the files during backup. (One volume holds 3.4 TB of data that occupy 1.7 TB of disk space; a full backup with Backup Exec 2014 backs up 3.4 TB.)

Now I would like to take a differential backup on weekdays, but I cannot get this to work reasonably. When I use the normal full/differential job combination, the differential backups are far too big:

  • Weekend full backup: 3.4 TB
  • Monday differential backup: 1.8 TB
  • Tuesday differential backup: 3.2 TB
  • Wednesday differential backup: 3.3 TB

(The actual amount of changed files should be less than 100 GB.)


I had the idea of using a time-based differential backup, so I tried to set up a job that backs up all files modified within the last 21 days. But I can't see any option in Backup Exec 2014 to do so. The only related option is in the file selections, where I can exclude "files not accessed within X days." But this does not work for me.

When I set up a job for this volume and use the "files not accessed within 21 days" option, the job fails with:

V-79-57344-37925 - Snapshot Technology: Initialization failure on: "Shadow Copy Components". Snapshot technology used: Microsoft Volume Shadow Copy Service (VSS).

Snapshot technology error (0xE0009425): The Shadow Copy Components file system does not contain any data to back up. Edit the job, remove the Shadow Copy Components selection, and then run the job again.

To make sure the job definition is otherwise OK, I removed the "files not accessed within 21 days" exclusion; in that case, the job works perfectly (but backs up the whole drive, of course).


So ... is there any way to create a reasonable differential backup of a deduplicated, DFS-replicated drive? Or do I have to create some robocopy job and then back up the copy ...?
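If it came to that, the robocopy fallback could look roughly like this. This is only a sketch: the staging path is an assumption, and /MAXAGE:21 mirrors the 21-day idea above (it excludes files older than 21 days, so only recently modified files are copied):

```shell
:: Hypothetical staging copy for a time-based differential (paths are assumptions).
:: /E       include subdirectories (including empty ones)
:: /MAXAGE:21  copy only files modified within the last 21 days
:: /NP      suppress per-file progress output in the log
robocopy "G:\" "E:\BackupStaging" /E /MAXAGE:21 /NP
```

The staging directory could then be selected in a normal Backup Exec job; note, though, that this doubles the disk I/O and loses the original share's security descriptors unless /COPYALL or /SEC is added.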



3 Comments

Colin Weaver:

Are you backing up from the share or from the Shadow Copy Components of the DFS-R data?


stephan.vanhelden@uponor.com:

I'm backing up from Shadow Copy Components. If I select the G: drive instead, I get a "successful" backup of 0 files in 0 folders (because it is DFS-replicated).

Or do you mean I should create a new share (like sharing G:\ as G-BACKUP) and then back up \\fileserver\g-backup?

stephan.vanhelden@uponor.com:

OK, I think I've found the mystery, but I don't have a solution: Windows deduplication sets the archive bit whenever it optimizes a file.

This may even make sense. If someone does an optimized backup (that is, backing up the chunks and reparse points actually present on the disk), a differential backup must back up all changed chunks and reparse points. That would not be very much data either (just a lot of files).

But since BE does an unoptimized backup, it considers all these files changed, rehydrates them, and backs them up.

I have a similar issue on a regular file share (one that is not DFS-replicated); there I modified the job to use the "modified time" method for the differential backup, which may help. But there is no such option for DFS-R backups that use the Shadow Copy store ...
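The difference between the two selection methods can be sketched like this. The file records, names, and dates below are hypothetical, and the key assumption is the one described above: dedup optimization sets the archive bit but leaves the file's modified time untouched.

```python
from datetime import datetime, timedelta

# Hypothetical file records; "archive" stands for the Windows archive
# attribute. Assumption: dedup optimization sets the archive bit on files
# that were never edited, while their modified time stays old.
now = datetime(2014, 6, 11)
files = [
    {"name": "report.docx", "archive": True, "modified": now - timedelta(days=1)},    # genuinely changed
    {"name": "old1.pdf",    "archive": True, "modified": now - timedelta(days=400)},  # only re-optimized by dedup
    {"name": "old2.pdf",    "archive": True, "modified": now - timedelta(days=200)},  # only re-optimized by dedup
]

# Archive-bit method: dedup has set the bit on untouched files too,
# so nearly everything gets selected for the "differential" backup.
by_archive_bit = [f["name"] for f in files if f["archive"]]

# Modified-time method: only files actually changed since the last full.
last_full = now - timedelta(days=3)
by_modified_time = [f["name"] for f in files if f["modified"] > last_full]

print(by_archive_bit)    # all three files
print(by_modified_time)  # only report.docx
```

This is why the differential jobs above approach the size of the full backup: once dedup has re-optimized a volume, the archive-bit method sees almost every file as changed, while a modified-time method would select only the genuinely edited files.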