My Backup Exec Meta Data Index-Catalog is huge (280GB!) - is that normal?
We are running Backup Exec 2012, with a weekly backup of our primary server that has Exchange 2010.
The server has a 1TB drive, and a quick overview of the local partitions shows we are using 415GB.
Question 1 (primary):
The Backup-Full process just completed successfully and the backup set is taking up 407GB.
Immediately following that backup, Backup Exec seems to run a Backup-Full-Meta Data Index-Catalog 000##.
The web tells me this is an index of all the files stored in the aforementioned Backup-Full. That sounds perfectly reasonable.
Less reasonable, perhaps, is that this process (which also completed successfully) seems to be using up an additional 280GB.
This is posing a problem: we only have a 1TB backup disk, and the bi-weekly differential (weighing in at 100GB, with another 100GB catalog), coupled with a small backup of our SharePoint database (on a separate VM), is getting us very close to our cap.
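For reference, this is roughly how I'm measuring the on-disk sizes quoted above (a quick Python sketch; the catalog and backup paths are just examples from my setup and will differ on other installs):

```python
import os

def folder_size_gb(path):
    """Recursively sum the sizes of all files under path, in GB."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # skip files that vanish or can't be read mid-scan
    return total / (1024 ** 3)

if __name__ == "__main__":
    # Example paths from my environment -- adjust to your own install.
    for label, path in [
        ("Catalogs", r"C:\Program Files\Symantec\Backup Exec\Catalogs"),
        ("Backup data", r"E:\BackupExec\Data"),
    ]:
        print(f"{label}: {folder_size_gb(path):.1f} GB")
```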
Can someone please tell me if the 280GB is normal?
Question 2 (secondary):
The fellow who configured our application also set up two additional backups: one is a full backup of a different VM (the aforementioned SharePoint databases reside on that VM), and the other is a backup job for "Server Farm 1". Is this a normal thing that someone can explain to me? I'm relatively new to Backup Exec terminology, so I have a hard time distinguishing Symantec's built-in processes from our IT team's configuration decisions.

The configuration lists the following backup sets: "ConfigurationV4-DB (Server\Sharepoint\Sharepoint_ConfigurationDatabase)", "Microsoft Sharepoint Foundation Web Application", "Services", "Shared Services", "Sharepoint Foundation Search", and "WSS_Administration". I don't really understand why all these items need to be backed up separately from the full backup of our primary server, so any insight you can offer would be much appreciated.