
Backup sets

Created: 15 Oct 2012 • Updated: 05 Nov 2012 | 3 comments
L_4
This issue has been solved. See solution.

I am new at my company, and there is already a backup in place using Symantec Backup Exec 2010.

We have one location that runs the backup software (HQ) and remote sites (that are on slower links) that send the required data back to HQ for backup.

We are using DFS to copy the data from remote sites to a folder on our server that the Symantec program uses to back up from. Right now this folder is 880GB in size, and from what I have been told, this folder is then checked for changes and the changes are backed up to tape. This folder is required...

Is there not a client for Symantec that can be run on the remote servers so that ONLY the changed files are sent? Then only those changed files would need to be in the HQ folder that is backed up. Is there really a need for a folder almost a TB in size for Symantec to be able to accomplish the job?

Thanks in advance,


Comments

CraigV


You can look at deduplication within BE 2010 which may actually solve your issue. Only new and changed data would be transferred to the media server.
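To see why deduplication cuts transfer volume so sharply, here is a minimal, illustrative sketch of the general idea (not Backup Exec's actual implementation): data is split into chunks, each chunk is hashed, and only chunks whose hash has not been seen before are stored or transferred.

```python
import hashlib

def dedupe_store(chunks, store=None):
    """Illustrative block-level deduplication.

    Each chunk is hashed; only chunks whose hash is not already in the
    store are kept. Repeated data costs nothing after the first copy.
    Returns the store and the number of genuinely new chunks.
    """
    store = {} if store is None else store
    new_chunks = 0
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk  # unseen data: store/transfer it
            new_chunks += 1
        # seen before: only a reference to the existing chunk is needed
    return store, new_chunks

# Backing up the same data a second time stores zero new chunks:
store, first = dedupe_store([b"aaaa", b"bbbb", b"aaaa"])
store, second = dedupe_store([b"aaaa", b"bbbb"], store)
```

With client-side dedupe, the hashing happens on the remote server, so unchanged chunks never cross the slow link at all.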

Check below for further information:



L_4

I checked out the video on deduplication folders. That is similar to what DFS is doing now, isn't it? I also checked the Symantec software we have now (2010) and don't see a deduplication folder option on the Devices page.

Maybe a better way to phrase it: is there not an agent that can run on the remote servers and send only the incremental data to my HQ folder for backup to tape, overwriting this data each day if a change has been made?

I just don't understand why there is a need for this huge folder for the backup. Can the HQ server (Symantec server) not just back up changed data (nightly) to tape and not to a folder? Collect the changed data, back it up, remove the changed data?



CraigV

Dedupe and DFS are completely different, so don't get the two confused.

Dedupe would use the RAWS agent on the local server to send changed data across the network to the media server (client-side dedupe).

The RAWS agent communicates with the media server from the remote server... if you had INCR/DIFF jobs, for instance, only the changed data on the server would be sent via that agent.
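The selection logic behind an incremental job can be sketched in a few lines. This is only an illustration of the concept (Backup Exec may also use the archive bit rather than timestamps): walk the tree and pick only files modified after the previous backup, so only changed data crosses the network.

```python
import os

def incremental_selection(root, last_backup_time):
    """Illustrative incremental selection.

    Walks a directory tree and returns only the files modified after
    the previous backup time, the way an INCR job selects changed data.
    Unchanged files are skipped entirely.
    """
    changed = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_backup_time:
                changed.append(path)
    return changed
```

On an 880GB data set where only a few GB change daily, this is the difference between replicating everything and sending a small delta.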
