
Scanning File List for Restore Takes Forever

Created: 04 Jan 2007 • Updated: 21 May 2010 | 17 comments
Randy Samora:

I have an installation of NBU 5.1 MP5 in an isolated test environment. I use the test lab to import production backup tapes and test restores from those tapes. When I initiate a restore, it takes 2 or 3 minutes for each step. If I want to deselect a file 3 folders deep, for example, it takes 2 or 3 minutes to drill down each folder level. My production database is nearing 1/2 TB and it doesn't take that long. Is this the normal behavior of an imported tape, or is there something else I can try to speed things up? I tried installing 6.0 but it reacted the same way.

Thanks,
Randy

Comments (17)

Stumpr2:

Hello Randy,

It's been a while. Been busy?


> in an isolated test environment....

The first thing I would check is name resolution. Perhaps it is going through some timeouts before actually resolving the names and addresses.

You know, you don't have to use the GUI to do restores. The bprestore command is very fast. But if you must use the GUI, you could pre-populate the search directory to start at the lowest-level folder.

bpimport itself should not cause delays like the ones you are seeing.
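For reference, a bprestore invocation might look like this (the client name, path, and log location below are made up for illustration):

```shell
# Restore /data/projects to the same client it was backed up from.
#   -C  client the files were backed up from
#   -D  client to restore to
#   -L  progress log file to tail instead of watching the GUI
bprestore -C prodclient01 -D prodclient01 \
          -L /tmp/restore_progress.log /data/projects
```

Tailing the -L log gives you the same progress information as the GUI without the folder-browsing delays.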

VERITAS ain't it the truth?

Randy Samora:

Busy? Our data has grown so rapidly, and I have engineers who see a SAN and think, "Oh cool, I can create 2 TB volumes now and I don't have to manage so many different drives." Have you tried backing up a Windows server on a single stream with 2 TB of data within a 48-hour weekend window? But I'm sure I'm preaching to the choir.

I think you nailed the issue. I was fooling around with the restores and I added an entry in the hosts file for the client and the Master. The folders started popping up instantly. A couple of years ago I had a DNS issue in the test lab and I recall the same kind of stuff started happening so now I'm looking at my DNS. If nothing else, NetBackup has always made a great network troubleshooting tool. If there's something wrong with your network, NetBackup will find it.
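For anyone who finds this thread later: the workaround amounts to pinning the lab names in each host's hosts file so lookups never touch the broken DNS. A sketch (all names and addresses here are invented):

```
# /etc/hosts on UNIX, %SystemRoot%\System32\drivers\etc\hosts on Windows
10.10.1.5     nbumaster    nbumaster.lab.example
10.10.1.21    testclient1  testclient1.lab.example
```

Entries are needed on both the master and the client, since NetBackup resolves names in both directions.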

DavidParker:

> Have you tried backing up a
> Windows server on a single stream with 2TB of data
> within a 48 hour weekend window? But I'm sure I'm
> preaching to the choir.

I have a couple of servers like that too. Fortunately they let me run with multiple streams (4 at a time), but it still takes a good 36 hours to get all 2 TB of the data onto tape.

Fortunately they've reached a bit of a limit between the server, HBA and SVC and can't ask for any more disk space until they do some updates. =)

Randy Samora:

What process are you going through to get multiple streams? I have one client with 5 folders at the root, and I can easily create 5 streams in the file list. I have another client where I literally had to go in and create a stream for A*, B*, ... 1*, 2*, etc. for each alpha and numeric character. My fear is that someone is going to get on that first client I mentioned and create a 6th folder, and I'm not going to hear about it until they want something restored.

Randy Samora:

I was right! How do I give myself points? :) I only have a handful of test servers, so after I restore a client and verify it was successful, I rebuild for the next restore. The DNS cache wasn't cleaning up after itself and still had old names assigned to the same IP addresses. Just FYI in case anyone runs into something like this.

Stumpr2:

Randy,

Are you asking for a script that can be scheduled to run prior to the backup to:
1. determine existing folder names,
2. build a NEW_STREAM and folder listing for use in the "Backup Selections"
3. modify the policy to use the new Backup Selections

This sounds like a noble goal and I would be interested in such a script as well.
Any takers?
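Not a full taker, but here's a rough starting point for steps 1 and 2 in plain shell. The gen_selections name and folder layout are made up, and step 3 (pushing the list back into the policy, e.g. via bpplinclude) is left as an exercise:

```shell
#!/bin/sh
# Emit a NetBackup "Backup Selections" list with one NEW_STREAM
# directive per existing top-level folder under the given base path.
gen_selections() {
    base="$1"
    for d in "$base"/*/; do
        [ -d "$d" ] || continue          # skip if the glob matched nothing
        printf 'NEW_STREAM\n%s\n' "${d%/}"
    done
}

# Example: print the selections for /data (change to suit).
gen_selections "${1:-/data}"
```

Scheduled to run just before the backup window, the output could refresh the policy's file list so a newly created 6th folder gets its own stream automatically.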

VERITAS ain't it the truth?

Randy Samora:

I saw that little diploma next to your name and figured it was worth asking. I guess they didn't give you a magic wand with the diploma?

Stumpr2:

Chuckle. No, dang it!

For your previous question, could you simply get by with this as the "Backup Selections":

H:\Foldername\*

VERITAS ain't it the truth?

Randy Samora:

The folders are generated by the application and there are thousands of them. The folder names are hexadecimal so I'm fortunate in that there will never be a folder above F*. There are new folders every day with a 3 character name.

DavidParker:

Bob's got it (as usual) ...
You can use the '*' operator in the backup selection.
That'll keep you safe in the event a 6th folder gets added to that location.

Randy Samora:

When I originally created the policy for the volume with 2 TB of data, I created a separate line entry for each alpha and numeric character; 36 lines in all. I wanted each stream to be a separate job so that if one failed, I didn't lose everything; only the folders beginning with that character had to be rerun. That was my intent. I created the 36 line entries WITHOUT the NEW_STREAM entry before each line. I kicked off the job manually expecting to see 36 new jobs in the Activity Monitor. Three hours later I was still trying to cancel jobs and clean up the mess. And don't get me started on the hate mail I got from the guys who receive an email every time a job fails.
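For anyone hitting the same wall: without an explicit NEW_STREAM directive before each entry, NetBackup treats the whole list as a single stream. The per-character split described above would look roughly like this (drive letter and path invented):

```
NEW_STREAM
H:\Data\A*
NEW_STREAM
H:\Data\B*
...
NEW_STREAM
H:\Data\8*
NEW_STREAM
H:\Data\9*
```

Each NEW_STREAM block then shows up as its own job in the Activity Monitor, so a failure only forces a rerun of that one stream.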

DavidParker:

Doh!

Yeah, I've been yelled at by Exchange and SQL DBAs before for similar things.
It's always fun!

But yeah, using the '*' made this current setup work quite well; I get 1 job per sub-folder and then I have the policy restricted to only run 4 jobs at a time. Bump the backup window time up and it works like a champ.

We're hoping to move some of their data onto the E-Vault system we have (currently only used on email), so hopefully that will make my life even easier in the near future.

Randy Samora:

I was considering writing my memoirs of my experiences with women and sending that off with my son to college. Now I'm thinking of writing my memoirs of my experience with backups. Trying to decide which would get the better laugh.

Stumpr2:

LOL - Sorry Randy!

> hate mail I got from the guys who receive an email every time a job fails.

They know not of what they ask. It's even worse for the poor blokes that want the email sent to their pager or blackberry.

I hear you man!

VERITAS ain't it the truth?

DavidParker:

Hey, if they have to be sooooo connected as to have a 'Crack-berry', that's their problem, I say ;)

Personally, I'm just waiting for someone to start a webcomic about the daily tribulations of a Backup Admin ...
"My server crashed last night and it's YOUR fault!"

Randy Samora:

I hate using up these resources to joke around but I just made myself laugh out loud with some of the ideas for the webcomic.

I deleted a file this afternoon and I need it restored.
(After an hour searching the database) When did you create the file?
This morning.

Or...."Where was the file located?" "On my C: drive." or "I don't know."
What was it called? "Something like Budget-something or 2006 Budget items or something like that."

Send me an email at randy.samora@stewart.com with your ideas and I'll work on this. Has anyone seen James? I'm sure he has some input. Am I going to get in trouble for posting my email address? Might work out better than the online dating deal :)

DavidParker:

Absolutely HILARIOUS, Randy!
I've seen each and every one of those cases too!
So true ...

Oh, how about the times when the files haven't actually been deleted but have been 'accidentally' moved by someone? "Ooops, how did that get there?"

I don't think anyone will complain about some mild social use of the forum; at least it's vaguely on topic still ...

:)