Dedupe concurrent jobs limit question

Created: 11 Jan 2013 • Updated: 11 Jan 2013 | 6 comments
David Hood:
This issue has been solved. See solution.

I know that for a dedupe store there is a limit on the maximum concurrent jobs setting.

My question is what happens to a backup job if the dedupe folder is currently being used at its maximum concurrent jobs setting: does the job fail, or does it get queued?

To explain better:

I have a new media server with 8 cores and 192 GB of RAM, so it is fairly chunky. We want to back up our servers to a dedupe store on it. We have approximately 60 servers.

For argument's sake, let's say I set the maximum concurrent jobs to 10. If I schedule the backup jobs for these servers to run at 7 PM, 60 servers will try to back up to the store at 7 PM. I'm assuming that 10 will "win" and start backing up, but what will happen to the other 50 jobs? Will they be queued and then start once one of the original jobs completes, or will they fail?

I would really rather not have to create 60 jobs with different start times if possible, as the management and scheduling of that would be a nightmare.

Can anyone help? I hope I have explained it OK.



Comments (6)

Kiran Bandi:

The other 50 jobs will be queued, as BE will be waiting for a suitable device to start the backup.

Gurvinder Rait:

The 50 would show up with the status "Ready; No idle devices are available" and would start up when a device becomes available. The BE Dedupe Option works like a robotic library in the background: if you set a concurrency of 10, one device is reserved for a changer and it will show 10 drives. For example, if Dedupe is the name of the folder, you would see jobs going to Dedupe:1, Dedupe:2, etc. When one of the devices is free, the next job uses it to mount the media and the job starts. No need to schedule them at different times.
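The queueing behavior described above can be sketched with a semaphore standing in for the pool of virtual drives. This is only an illustrative model, not how Backup Exec is implemented: the job count, concurrency limit, and sleep duration are all made-up numbers.

```python
import threading
import time

MAX_CONCURRENT = 10   # hypothetical "max concurrent jobs" setting
TOTAL_JOBS = 60       # hypothetical number of servers

# The semaphore plays the role of the dedupe folder's virtual drives:
# only MAX_CONCURRENT jobs can hold a "drive" at once; the rest queue.
drives = threading.Semaphore(MAX_CONCURRENT)

lock = threading.Lock()
running = 0   # jobs currently holding a drive
peak = 0      # highest concurrency observed
completed = []

def backup_job(server_id):
    global running, peak
    with drives:                      # job queues here until a drive is free
        with lock:
            running += 1
            peak = max(peak, running)
        time.sleep(0.01)              # stand-in for the actual backup work
        with lock:
            running -= 1
    completed.append(server_id)       # list.append is thread-safe in CPython

threads = [threading.Thread(target=backup_job, args=(i,))
           for i in range(TOTAL_JOBS)]
for t in threads:
    t.start()                         # all 60 jobs "kick off at 7 PM"
for t in threads:
    t.join()

print(f"completed={len(completed)}, peak concurrency={peak}")
```

All 60 jobs eventually complete, and at no point do more than 10 run at once; none of the queued jobs fail merely for having to wait.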

Let me know if this helps.

pkh:

While the other jobs are in the queue, their wait time is ticking away. Make sure that you set the wait time long enough that the queued jobs do not time out.
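The timeout risk pkh describes can be sketched the same way: a queued job only waits as long as its configured wait time. The numbers below are hypothetical and deliberately set so the wait expires before a device frees up.

```python
import threading

drive = threading.Semaphore(1)    # model a single virtual drive
drive.acquire()                   # a long-running job is holding it

# A queued job waits only as long as its configured wait time; if no
# device frees up in time, the job fails instead of running.
WAIT_TIME_SECONDS = 0.05          # hypothetical, deliberately too short
got_device = drive.acquire(timeout=WAIT_TIME_SECONDS)
print("job ran" if got_device else "job timed out in the queue")
```

With a 7 PM kickoff for 60 jobs, the last jobs in the queue may wait through several full backup windows, so the wait time needs to cover the worst case, not the average.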

Gurvinder Rait:

I started 3 jobs at the same time with the dedupe concurrency set to 2, and here are the results:

David Hood:

Wow, thanks very much for all your replies, guys. You are all very helpful indeed!



teiva-boy:

Up to 16 is what the GUI allows in Backup Exec.

You're limited by much more than CPU/RAM. Your disks also need to be able to handle the sustained write throughput.
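A quick back-of-the-envelope check makes the disk point concrete. Every figure here is an assumption for illustration (per-stream ingest rate and disk throughput vary widely by environment); only the 16-stream GUI maximum comes from the thread.

```python
# Back-of-the-envelope check (all figures hypothetical): can the disk
# sustain the aggregate write rate of the concurrent streams?
streams = 16                 # GUI maximum mentioned above
per_stream_mb_s = 50         # assumed ingest rate per backup job
disk_sustained_mb_s = 600    # assumed sustained write rate of the array

required = streams * per_stream_mb_s
verdict = "OK" if required <= disk_sustained_mb_s else "disk is the bottleneck"
print(f"required {required} MB/s vs disk {disk_sustained_mb_s} MB/s: {verdict}")
```

Under these assumed numbers the 16 streams would demand 800 MB/s against a 600 MB/s array, so the disk, not the CPU or RAM, would cap concurrency.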

There is an online portal; save yourself the long hold times. Create a ticket online, then call in with the ticket # in hand :-) "We backup data to restore, we don't backup data just to back it up."