
Method of backing up remote sites

Created: 04 Jun 2013 • Updated: 19 Jun 2013 | 8 comments
Yet
This issue has been solved. See solution.

Hi!

I have a NetBackup 7.5 environment with a 5220 Appliance media server. Everything is OK; however, my dilemma is backing up remote sites. The link to those sites is not that good. I thought of enabling client dedupe, but the result is still not OK: about 90% of the jobs fail due to link errors.

Is there any other way I can back them up, aside from increasing the bandwidth?

 


Comments (8)

Nicolai

Client-side dedupe is a very good choice. However, if the link is bad, NetBackup will always fail.

Try setting up a ping test from the 5220 to one of the remote clients. If you see a large number of failed pings, the line is defective and the network team should be notified.

This command sends 999 pings, each with an 8 KB payload, to the host srv1 (Windows ping syntax):

ping -n 999 -l 8192 srv1.acme.com
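On a Linux media server the equivalent flags are -c (count) and -s (payload size). A small sketch like the following (host name carried over from the example above, everything else is just illustrative) extracts the packet-loss percentage so the result is easy to check in a script:

```shell
#!/bin/sh
# Sketch: ping a remote client and report packet loss
# (GNU/Linux iputils ping syntax: -c = count, -s = payload bytes).
HOST=${1:-srv1.acme.com}   # example host from the post above
COUNT=${2:-999}
SIZE=${3:-8192}

# Pull the loss percentage out of the ping summary line,
# e.g. "999 packets transmitted, 950 received, 4% packet loss"
LOSS=$(ping -c "$COUNT" -s "$SIZE" "$HOST" \
    | grep -o '[0-9]*% packet loss' \
    | grep -o '^[0-9]*')
echo "${HOST}: ${LOSS}% packet loss"
```

Anything more than the occasional lost packet on a sustained run like this is worth escalating to the network team.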

Assumption is the mother of all mess ups.

If this post answered your question, please mark it as a solution.

SOLUTION
Yet

Thanks.

The network issue in this case is a known fact; I'm afraid I can't do much about it.

Would enabling Accelerator make any difference?

Brook Humphrey

NetBackup Accelerator will not, by itself, reduce the amount of data that you send from the client.

As stated above, client-side dedupe is the best option for that.

However, if your rate of change is low and not much changes on the client, you could try Accelerator; you may find that it works better for you.

The main advantage of Accelerator is that every backup acts like a full when you go to do a restore, but it only sends about the same amount of data as an incremental.

In the case of a bad network connection, if it is a leased line you could at least contact your provider and get them to look into the issues you are having.

If you log a case with support, we have a network-analyzing tool called AppCritical that they can help you run. It will give you a good idea of where and why your network might be having issues, so you can address them yourself, with your networking team, or, if it is a leased line, with your provider.

Hopefully at some point you will see new features added to NetBackup to help with situations like this, but for now we can work with you on troubleshooting the network.

Thanks

Brook Humphrey
Managed Backup Service

Principal Backup Administration Specialist

RLeon

"I thought of enabling client dedupe but result is still not ok, 90% its failing due to link error."

In addition to client-side dedupe, please also enable the Resilient Network property for the remote clients. It is a feature designed specifically for backing up remote clients to media servers across "high-latency, low-bandwidth networks such as WANs".
Basically, it prevents a job from freaking out and failing just because a packet or two goes missing.

 

Enabling this feature consumes more system resources on both the media server and the clients being backed up. It is a trade-off between that and having the job fail altogether.

"A resilient connection can survive network interruptions of up to 80 seconds."
If your remote connection often causes the clients to lose connectivity for more than 80 seconds, then the Resilient Network feature may not be able to help much.

For more information, please refer to the NetBackup 7.5 Administrator's Guide, Volume I, under the topic "Resilient Network properties".
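For reference, resilient connections can also be configured outside the GUI via the RESILIENT_NETWORK entry in bp.conf on the master server. This is only a sketch; the subnet below is an example value, so verify the exact syntax against the admin guide for your version:

```
# bp.conf on the master server (example values; check the 7.5 admin guide)
# Enable resilient connections for all clients on the remote-site subnet:
RESILIENT_NETWORK = 192.168.100.0/24 ON
```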

huanglao2002

I agree with RLeon. You can try testing the Resilient Network function at your site.

Yet

Thank you guys for all the responses.

I'm pushing hard now for a better link. Once that materializes, I'll implement all your recommendations.

I'm not sure if I can mark more than one solution; if not, I'll just mark the first one. Hope that's OK with everyone. :-)

Again, thanks everyone.

 

Nicolai

You can now request a "split solution" since the latest Connect update. If you open a new discussion, you should be able to see the "request split solution" option.

Best Regards

Nicolai


bak

I know this post has already been marked as solved, but I thought I would add a couple of points from my experience with slow links to remote sites where we use client-side deduplication. We have some sites on fractional T1 links; they are so small and remote that no one wants to fund more bandwidth.

You can "Jump-Start" client-side deduplication IF you have a copy of some of the files that you would find at the remote site.

For example, say you have 20 Windows 2008 R2 servers at the remote site. You could back up a vanilla 2008 R2 server that has a fast link, thereby adding all of those common Windows objects to the dedupe pool so they do not have to be re-sent by the remote jobs.

Also, I once tanked a remote-site link during an appliance swap-out, and ended up having to seed the new appliance with backup images from the previous appliance (via bpduplicate).
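A seeding run of that sort might look something like the sketch below. The backup ID and storage unit name are hypothetical, and bpduplicate takes many more options, so check the NetBackup Commands Reference before running anything like this:

```
# Hypothetical example: duplicate an existing image into the new
# appliance's dedupe storage unit so subsequent client-side-dedupe
# backups find matching segments already in the pool (names made up).
bpduplicate -backupid winsrv01_1370300000 -dstunit new_5220_dedupe_stu
```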

Once you have a FULL of everything, your daily jobs will probably run fine as long as the change rate is relatively low. Consider getting a full of your clients one at a time; once you have, you can turn them all on.