
Duplicate a backup to another location over the internet

Created: 13 Nov 2011 | 7 comments

Hi everyone,

A colleague of mine bought a NAS for one of our customers to store backups at another location over the internet.

So basically they have one site where the media server and the virtual servers are located. The customer wanted an additional fail-over and to place a server at the CEO's location so that he always has a backup. Backup to tape is also done as an extra safety measure.

Now I need to set this up, so my question to you is: what can be done in Backup Exec 2010 R3 to duplicate backups to the other server over the internet? My only concern is the internet bandwidth. We also do VM backups of the Hyper-V host, which runs 5 servers whose virtual hard disks hold around 500 GB of data. Only a small percentage of this is really used, but Backup Exec backs up the entire VHD. We also back up the data on these VHDs, but I think only the VMs need to be synchronised to the other location, because they contain everything and are just in case the building burns down.
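To get a feel for why the bandwidth worries me, here is some back-of-the-envelope arithmetic (nothing Backup Exec-specific, and it ignores protocol overhead and retransmits, so real jobs will be slower):

```python
def transfer_hours(size_gb: float, link_mbps: float) -> float:
    """Rough time in hours to push size_gb (decimal GB) over a link of
    link_mbps. Ignores overhead and retries, so treat it as a best case."""
    bits = size_gb * 8 * 1e9            # GB -> bits
    seconds = bits / (link_mbps * 1e6)  # Mbit/s -> bit/s
    return seconds / 3600

for mbps in (2, 10, 100):
    print(f"500 GB over {mbps:>3} Mbit/s ~ {transfer_hours(500, mbps):,.1f} h")
```

Even on a 100 Mbit/s line a 500 GB full backup takes the better part of half a day; on a typical 2011-era VPN link it runs into weeks, which is why shipping full VHDs repeatedly is not realistic.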

She told me that the option "BE 2010 Advanced Disk-Based Backup" would make it possible to duplicate the backup to another location.

I have already set up the B2D-to-tape (B2D2T) scenario, but do I also have to back it up to the remote host? Because that would take a lot of bandwidth.

She said we would make a full backup first on premises and then just incrementals, so that bandwidth wouldn't be an issue. But surely we can't make just one full backup and have everything after that be incremental?!

What are your suggestions on this one?

7 Comments

Calis's picture

I'm interested in the same approach, Silencer, so I'm watching this closely. :)

pkh's picture

ADBO does not offer anything that will help you back up over the Internet better. As you have pointed out, the bandwidth between the two locations is the problem, and there is no way around that. Normally it is very difficult to complete either a backup or a duplicate over a low-bandwidth link; a few dropped packets and the job will fail.

If you just do one full backup and then incrementals after that, you will have a big problem when you restore: you would need to restore the full plus ALL the incremental backups.
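A toy sketch of the restore chain makes the difference obvious (the "differential" case is shown only for contrast; it is the standard alternative, not something specific to your setup):

```python
def restore_chain(strategy: str, sets_since_full: int) -> list:
    """Which backup sets must be restored, in order.
    Incrementals: the full plus EVERY incremental since it.
    Differentials: the full plus only the most recent differential."""
    if strategy == "incremental":
        return ["full"] + [f"incr {i}" for i in range(1, sets_since_full + 1)]
    if strategy == "differential":
        return ["full", f"diff {sets_since_full}"] if sets_since_full else ["full"]
    raise ValueError(f"unknown strategy: {strategy}")

# after a month of daily jobs since the one full backup:
print(len(restore_chain("incremental", 30)))   # 31 sets to restore
print(len(restore_chain("differential", 30)))  # 2 sets to restore
```

If any one of those 31 incremental sets is damaged or missing, everything after it is unrestorable, which is why "one full forever" is a bad idea.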

The only way to reduce the amount of traffic across the link is to use optimised deduplication. You need to set up a media server on each end of the link with the dedup option. After you back up with dedup on one end, duplicate that backup with optimised dedup over the link. In this scenario, only the changed data blocks are transmitted across the link.
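Conceptually, the dedup store on each side tracks block fingerprints, and an optimised duplicate only ships blocks the far side has never seen. This is a deliberately simplified sketch of that idea (fixed-size chunks and a plain hash set; a real dedup engine is far more sophisticated):

```python
import hashlib

CHUNK = 64 * 1024  # toy fixed-size chunking; real engines use smarter block schemes

def blocks_to_send(data: bytes, remote_hashes: set) -> int:
    """Count the chunks whose fingerprint the remote store doesn't already
    hold -- the idea behind optimised duplication between dedup stores."""
    sent = 0
    for i in range(0, len(data), CHUNK):
        h = hashlib.sha256(data[i:i + CHUNK]).hexdigest()
        if h not in remote_hashes:
            remote_hashes.add(h)
            sent += 1
    return sent

remote = set()
v1 = b"".join(bytes([i]) * CHUNK for i in range(8))        # initial 512 KB "backup"
first = blocks_to_send(v1, remote)                         # everything crosses the link
v2 = v1[:3 * CHUNK] + bytes([99]) * CHUNK + v1[4 * CHUNK:]  # next run: one chunk changed
second = blocks_to_send(v2, remote)                        # only the changed chunk crosses
print(first, second)  # 8 1
```

The first duplicate seeds the remote store (all 8 chunks go over), but the next run only transmits the single changed chunk, which is what makes the scheme workable over a slow link.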

Bas Lips's picture

Dedupe between media servers is the way to go, as pkh mentioned.

In general, dedupe software is not optimized for WAN connections, so it might not be very effective depending on the speed and quality of the line. Sometimes bandwidth optimizers do a lot to help the efficiency. These aren't always cheap, but there are some non-hardware (VM) appliances out there.


Colin Weaver's picture

This might help - but be aware that you might still need a reasonably reliable WAN/VPN connection between the locations.

Oh, and you do need Backup Exec 2010 R3 with the latest hotfixes and a CASO environment to enable the private cloud configuration discussed in the document.

Silencer01's picture

Hi everyone,

Thanks for the support and the well-written options. I just went through the PDF, and it seems we might have a problem. Currently we have a Dell R510 as a backup server (this one backs up to disk and afterwards to tape). My colleague also bought an NX3100 with Storage Server to put on the other end of the internet connection. The two sites will be connected with a permanent VPN connection.

I saw that you can deduplicate, but you need two devices from the same vendor. I went through the compatibility list, but I can't find the NX3100 listed as an OpenStorage device. Optimised deduplication would be exactly the thing we are looking for! Otherwise a 500 GB full backup each week/month over the internet is a bit too much considering the bandwidth.

We only have a single licence for Backup Exec 2010 R3 without CASO. Do I need to buy another Backup Exec licence for the other server, and then a CASO licence for the main backup server?

What are my options looking at this hardware?

Thanks again!

Silencer01's picture

Hi everyone,

I did some research on the licences for the deduplication scenario. Apparently you need the deduplication option on both backup servers. There is also a "Backup Exec 2010 Duplication Suite". So, to be 100% sure: I do not need another deduplication licence for the cloud server?

I also need a CASO licence for the primary backup server.

Is this possible with the hardware that was bought? And what would you suggest, given this hardware?

Silencer01's picture

I just read through the whole deduplication document and want to verify that I understand the ways to back up to another location. The first thing I noticed is that this scenario is perfect for companies who want to provide off-site backup for all of their customers by using one CASO server administered by the IT company? We will not be using it that way.

I have also read that you can use "client deduplication" (direct backup) to back up the data to the cloud services server. In this scenario you only need one backup server, in the cloud. The remote agents on the different servers that need to be backed up make sure that only the changed information is sent to the remote backup server. Is this correct?

Or you can use "media server deduplication", where you need a CASO server and a managed server. The CASO server is the server in the cloud, and the managed server is the on-premises server. The managed server does the backup instead of the remote agent (as in direct backup). Is this correct?

I only want the full VHD files to be deduplicated to the remote server in the cloud, because they contain everything and are easily restored. If I only back up the data inside the VMs, it isn't possible to restore everything if a fire burns down the customer's whole building, right? And is it possible, with single VHD files, to upload only the changed data to a deduplication folder?

Reading the document, the process of setting up jobs wasn't fully clear to me. If I choose client backup it seems simple: just define jobs on the server in the cloud, and the remote agents will do the backup to that server?

When I choose "media server deduplication" (setting up the multi-tenant or offsite-copy-to-cloud configurations), then you have the CASO server and the managed server. The managed server will just run the normal jobs:

  • Backup to Disk
  • Duplicate this backup to Tape

And then I need another job to duplicate the contents of this folder to a "deduplication storage folder" on the server in the cloud? And then only the changed data will be copied over the internet?

And how do you best decide which server is the CASO server and which is the managed server? Would the server in the cloud be the CASO server in this scenario, or is it better the other way around, and why exactly?

Thanks in advance for your help!!