
Backup Exec backup metrics/statistics

Created: 09 Mar 2010 • Updated: 12 Sep 2010 | 8 comments

Hi all,

Can someone advise where I can download some recent metrics/statistics for the Backup Exec 11 for Windows software?  I want to find out the standard time it takes to back up a certain amount of data, both full and incremental.  That way, I can see how our numbers match up against the benchmark.  If someone knows where I can download this information, please advise.

8 Comments

pkh:

I don't think such statistics exist.  The amount of time required to back up a certain amount of data is highly dependent on the particular computing environment.  If you are using a faster server, it backs up faster; likewise, a faster tape drive will help things.  Even when the hardware environment is the same, the workload on the server at the time of the backup has an impact on the backup time.  Given the high variability underlying such statistics, collecting them would not be meaningful.  It would be like comparing apples and oranges.

Gary Li:

One of our incremental backups last night took 1.5 hours to process 1.7 million files.  It didn't back up anything because nothing new had been added.  Now, is 1.5 hours for 1.7 million files a normal statistic?

CraigV:

1.7 million files is a lot of files, and depending on the size of each, BEWS needs to scan each and every file to check whether it has changed.
So theoretically, it could take that long. Factors such as when the backup is running (e.g. during application maintenance), whether it runs across a LAN/WAN, the type of HDDs used, etc. need to be factored in too...

Alternative ways to access Backup Exec Technical Support:

https://www-secure.symantec.com/connect/blogs/alte...

teiva-boy:

Well, that depends on whether you are using the "archive bit" method or the "modified time" method with the NTFS change journal (the latter is preferred).  One is much faster than the other because it does not have to scan the entire volume at the file level.
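To illustrate the difference in principle, here is a minimal sketch of a scan-based incremental selection using modification times. This is not Backup Exec's actual implementation; it just shows why a full-volume walk is expensive, and why the NTFS change journal (which records changes as they happen, so no walk is needed) is faster:

```python
import os


def incremental_by_mtime(root, last_backup_time):
    """Select files changed since the last backup by comparing
    modification times.  Note: this still walks every file on the
    volume, which is the expensive part.  The NTFS change journal
    avoids the walk entirely by logging changes as they occur."""
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_backup_time:
                changed.append(path)
    return changed
```

With 1.7 million files, even this selection step alone means 1.7 million stat calls, which is consistent with a long incremental that backs up nothing.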

There is an online portal, save yourself the long hold times. Create ticket online, then call in with ticket # in hand :-) http://mysupport.symantec.com "We backup data to restore, we don't backup data just to back it up."

Gary Li:

I tested a couple of backups, and the results led me to ask about these metrics.  Backup #1 was 68 GB of data (800k files): a full backup completed in 1.5 hours, and an incremental run (no new data to back up) completed in 10 minutes.  Backup #2 was 128 GB of data (1.7 million files): a full backup completed in 10 hours, and an incremental run (no new data to back up) completed in 1.5 hours.  The files are all .tiff images.  My question is: why such disparity in backup time when the data merely doubled?
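Turning those reported figures into throughput makes the disparity concrete (a rough back-of-envelope calculation on the numbers above, nothing Backup Exec-specific):

```python
def throughput(mb, seconds, files):
    """Return (MB/s, files/s) for a backup job."""
    return mb / seconds, files / seconds


# Sizes, times, and file counts as reported above (approximate).
full1 = throughput(68 * 1024, 1.5 * 3600, 800_000)    # backup #1 full
full2 = throughput(128 * 1024, 10 * 3600, 1_700_000)  # backup #2 full

print(f"backup #1: {full1[0]:.1f} MB/s, {full1[1]:.0f} files/s")
print(f"backup #2: {full2[0]:.1f} MB/s, {full2[1]:.0f} files/s")
```

Backup #1 moved roughly 13 MB/s; backup #2 only about 3.6 MB/s, despite running on the same system, so the per-file overhead clearly grew faster than the data did.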

teiva-boy:

It's not so much the size of the data as the file count.  RAM, disk speeds, etc. all contribute, making backup speeds inconsistent throughout the backup window.

It's a pain, really, as you can't really nail down a solid number to predict backup speeds.


Gary Li:

You are certainly right about the file count.  But I didn't expect that by roughly doubling the file count, the full backup time would go up 6-7 times and the incremental roughly 9 times.
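The scaling gap can be quantified from the figures earlier in the thread (a back-of-envelope sketch using the reported numbers, not a model of Backup Exec internals):

```python
# Ratios from the two jobs described earlier in the thread
# (sizes, file counts, and times as reported; approximate).
size_ratio = 128 / 68                # data grew ~1.9x
file_ratio = 1_700_000 / 800_000     # file count grew ~2.1x
full_time_ratio = 10 / 1.5           # full backup took ~6.7x longer
incr_time_ratio = (1.5 * 60) / 10    # incremental took 9x longer

# If cost were linear in size and file count, the time ratio would
# sit near the 1.9-2.1x range.  The observed 6.7-9x indicates a
# super-linear cost: per-file overhead (metadata lookups, catalog
# growth, directory traversal) dominating as the file count rises,
# compounded by memory pressure and fragmentation.
print(f"expected ~{file_ratio:.1f}x, "
      f"observed {full_time_ratio:.1f}x full / {incr_time_ratio:.0f}x incr")
```

In other words, the anomaly isn't that the time doubled; it's the extra 3-4x on top of that, which points at per-file costs rather than raw data volume.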

teiva-boy:

You could try short-stroking your HDDs, provided you can predict your growth accurately so as not to run out of space.  That would help contain your files and use only the outer edges of the platters, giving you maximum disk throughput.

For your incrementals/differentials, use modified time with the change journal; that should be faster, provided we're talking about NTFS.

