1. What is the Duplication to Remote Master feature?
Duplication to Remote Master is a NetBackup 7.1 feature in which backups on a storage server in one NetBackup domain can be automatically duplicated (replicated) to a storage server in a remote NetBackup domain. Once replicated, the images are automatically imported into the NetBackup catalog on the remote master.
2. What are the minimum requirements to implement Duplication to Remote Master?
Duplication to Remote Master requires NetBackup 7.1 domains. The storage servers in the source and target domains must support OpenStorage version 11 and provide plugins that implement Duplication to Remote Master. As of the NetBackup 7.1 release, the Media Server Deduplication Pool (MSDP) supports duplication to a remote master. Support for the PureDisk Deduplication Pool (PDDO) and NetBackup appliances will be available at a later time. For third-party OpenStorage devices, please contact the vendor for details on Duplication to Remote Master support.
3. Do I require any additional license to implement Duplication to Remote Master?
No, there are no additional license requirements specific to the use of Duplication to Remote Master.
4. Can I replicate images from one source domain to multiple target domains?
Yes, you can define multiple destinations for the source storage server at the source domain. Replications to multiple targets can occur concurrently if the backend storage server supports it.
5. Can I replicate images from multiple source domains to a single target domain?
Yes, the same storage server can be the target for multiple source domains. This is a good use case for storage servers at remote offices replicating to a central data center. Another use case is multiple consumers sending data to a central backup service provider.
6. Can I replicate in both directions between two NetBackup domains?
Yes, this is supported. It is ideal for environments where two production sites also act as disaster recovery sites for one another.
7. What is the difference between a traditional image import and the automatic image import available with the Duplication to Remote Master feature?
A traditional image import is a time-consuming, two-phase process. In the second phase, the entire image on storage must be scanned to regenerate the file metadata for the catalog. The larger the image on storage, the longer the import takes; likewise, the more files the image contains, the longer it takes to regenerate the catalog. The automatic image import available with Duplication to Remote Master is optimized: the file metadata for the backup is part of the NetBackup image on storage, so the metadata is shipped to the remote master during duplication. Importing is then simply a matter of reading the metadata shipped by the source domain and adding it to the catalog in the target domain. The size of the image and the number of files it contains have little impact on the optimized automatic import.
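The difference can be sketched with a small illustrative model. This is not NetBackup code; the function names and data shapes are hypothetical, chosen only to show why the optimized import is insensitive to image size and file count while the traditional import is not.

```python
# Toy model contrasting the two import styles (hypothetical, not NetBackup code).

def traditional_import(image_files):
    """Phase-two import: scan every file in the image to regenerate metadata."""
    catalog = []
    for f in image_files:             # cost grows with the number of files
        catalog.append({"name": f, "meta": "regenerated-by-scan"})
    return catalog

def optimized_import(shipped_metadata):
    """Automatic import: the metadata arrived with the duplicated image,
    so importing is just reading it and adding it to the catalog."""
    return list(shipped_metadata)     # no per-file scan of the backup image

files = [f"file{i}" for i in range(5)]
meta = [{"name": f, "meta": "shipped-from-source-domain"} for f in files]
assert len(traditional_import(files)) == len(optimized_import(meta)) == 5
```

In the model, `traditional_import` must visit every file, while `optimized_import` only copies metadata that the source domain already produced, which mirrors why the automatic import stays fast regardless of how many files the backup contains.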
8. What is Remote Retention?
In an inter-domain duplication scenario, the owner of the data is the source domain. The administrator of the source domain decides how long the data should be kept offsite (at the remote master). This is achieved by a special type of retention level that the target domain must use in the storage life cycle policy (SLP) that imports the image. This is called Remote Retention. At least one destination in the import SLP at the target domain must have Remote Retention set.
9. What happens if the remote domain does not have a storage life cycle policy with the same name as the source domain?
The duplications will continue to occur, but the images will not get imported at the remote master server. Once an import SLP with matching specifications is created, the images will get imported. Symantec recommends creating the target import SLPs; otherwise the storage server will hold images (potentially consuming storage space) that will never be cleaned up, even after the required remote retention period has passed. It is possible to configure the import manager to create the required import SLPs automatically by setting AUTO_CREATE_IMPORT_SLP to 1 in the LIFECYCLE_PARAMETERS file at the remote site.
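As a sketch of that last setting: on UNIX master servers the LIFECYCLE_PARAMETERS file is typically found under /usr/openv/netbackup/db/config/ (verify the location for your platform and release). The entry is a single line:

```
# LIFECYCLE_PARAMETERS on the target (remote) master server
# (path assumed: /usr/openv/netbackup/db/config/LIFECYCLE_PARAMETERS on UNIX)
AUTO_CREATE_IMPORT_SLP 1
```

With this set, the import manager at the remote site creates the matching import SLP automatically when images arrive without one.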
10. I was told that the storage server at the remote site notifies NetBackup when a new image arrives from the source domain. What happens if NetBackup is down at the remote site?
Duplication to Remote Master is based on OpenStorage 11 events and messages. The event associated with the arrival of a new image persists on the storage server until NetBackup acknowledges and deletes it. Hence the image will be processed once NetBackup comes back online.
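The acknowledge-then-delete semantics described above can be modeled with a toy event store. This is not the OpenStorage API; the class and method names are hypothetical, and the point is only that a persisted event survives a NetBackup outage until the import is done.

```python
# Toy model (not the OpenStorage API) of persistent image-arrival events:
# the storage server keeps each event until NetBackup acknowledges and
# deletes it, so nothing is lost while NetBackup is down.

class StorageServerEvents:
    def __init__(self):
        self._events = []             # events persist here until deleted

    def post(self, event):
        self._events.append(event)    # source domain completed a duplication

    def pending(self):
        return list(self._events)     # NetBackup polls after coming online

    def acknowledge(self, event):
        self._events.remove(event)    # deleted only after the import succeeds

store = StorageServerEvents()
store.post("new-image:backup_0001")   # arrives while NetBackup is down
assert store.pending() == ["new-image:backup_0001"]   # still queued at startup
store.acknowledge("new-image:backup_0001")            # import done, event gone
assert store.pending() == []
```

Because the event is removed only on explicit acknowledgement, an image that arrives during an outage is simply processed the next time NetBackup starts and reads the pending events.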
11. What happens if the remote master is down at the time of duplication?
As long as the remote storage server is up and running, duplications will work fine. The import will happen once the remote master comes back up.
12. What happens if the media server(s) connected to the storage server at the remote site are down?
If the storage server is an independent device, the duplications will work fine, and the import will happen when the media server comes back up. If the storage server is part of a media server (e.g., a Media Server Deduplication Pool), the media server must be running for duplications to run.