Backup and Recovery Community Blog

Updates to the CatMan toolset

Created: 27 Sep 2013
By SanjayPM

I wanted to share the latest updates to the CatMan toolset. We have re-architected the data-transfer portion of the tools to make it more resilient and reliable. The version with this feature set is 2.5.4. The main changes are as follows:

  1. Bi-directional communication between the source and target: We now use two ports, 9998 and 9999, to communicate between the source and target machines. With this bi-directional communication, the source and target stay in sync with each other, and each side knows if there are any failures on the other side.
  2. Multiple retries for data chunks: As you know, we break the data into multiple chunks of at most 10 GB each. We have implemented multiple retries so that if a chunk fails the first time due to network issues, it is re-sent until it succeeds. If a chunk still fails to transfer to the target after several retries, the tool will not exit; instead, it lists the chunks that failed to transfer so the consultant can examine them in greater detail, and continues transferring the next chunk.
  3. Cross check to ensure data integrity: For Catalog Manipulation engagements, data integrity is of paramount importance. To verify that there is absolutely no data loss between the source and target, we perform a cross check to confirm that all the data is exactly as expected on both sides. Any difference between the data sets is resolved before we proceed with any manipulation. In most instances, a difference is due to images expiring on the source.
  4. Merges done through disk reads: Previously, merges were done by merging the relevant EMM tables in memory. For large catalog merges, this used up all the available system and swap memory. When the tool was originally designed we were concerned about memory usage and wanted to monitor it carefully; based on real-world examples, we now perform merges through file reads of the database entries. One consequence of this change is that SLPs in the source domain must now be cancelled or completed, while SLPs in the target domain only need to be suspended.
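To illustrate the idea behind point 1, here is a minimal Python sketch of a chunk transfer with a status reply on the return path, so the sender learns immediately whether the receiver succeeded. The length framing, the `OK`/`ERR` replies, and the use of a local socketpair in place of the tool's two TCP ports (9998 and 9999) are all assumptions for illustration, not the tool's actual protocol.

```python
import socket

def send_chunk(sock: socket.socket, payload: bytes) -> bool:
    """Send one chunk, then wait for the receiver's status reply,
    so the sending side knows right away if the far end failed."""
    sock.sendall(len(payload).to_bytes(8, "big") + payload)
    status = sock.recv(3)            # receiver reports b"OK " or b"ERR"
    return status == b"OK "

def receive_chunk(sock: socket.socket) -> bytes:
    """Read one length-prefixed chunk and report the outcome back."""
    size = int.from_bytes(sock.recv(8), "big")
    data = b""
    while len(data) < size:
        part = sock.recv(size - len(data))
        if not part:
            sock.sendall(b"ERR")     # tell the sender we failed
            raise ConnectionError("peer closed mid-chunk")
        data += part
    sock.sendall(b"OK ")             # tell the sender we succeeded
    return data
```

In a local demo, `socket.socketpair()` stands in for the two machines: run `receive_chunk` on one end (e.g. in a thread) and `send_chunk` on the other, and the sender's return value reflects the receiver's report.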
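The retry-and-continue behaviour in point 2 can be sketched as follows. The retry count, the `send` callback, and the error type are illustrative assumptions; the post only says the tool retries several times, lists a chunk as failed for the consultant, and moves on to the next chunk rather than exiting.

```python
MAX_RETRIES = 5  # assumed value; the post only says "several re-tries"

def transfer_all(chunks, send):
    """Attempt each (chunk_id, data) pair up to MAX_RETRIES times.
    Failed chunks are collected for later review instead of aborting
    the whole transfer."""
    failed = []
    for chunk_id, data in chunks:
        for attempt in range(MAX_RETRIES):
            try:
                send(chunk_id, data)
                break                       # chunk made it across
            except OSError:
                pass                        # a real tool would back off here
        else:
            failed.append(chunk_id)         # exhausted retries: record and move on
    return failed
```

The `for`/`else` makes the policy explicit: only a chunk that fails every attempt ends up in the failure list, and the loop then continues with the next chunk.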
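The cross check in point 3 can be thought of as comparing per-record fingerprints on both sides. This Python sketch uses SHA-256 digests over invented `(record_id, contents)` tuples purely for illustration; the tool's actual comparison of catalog data is not described in the post, but the categories match it, including records absent from the source because the images expired there.

```python
import hashlib

def fingerprint(records):
    """Map each record id to a digest of its contents."""
    return {rid: hashlib.sha256(blob).hexdigest() for rid, blob in records}

def cross_check(source, target):
    """Return record ids that are missing on the target, absent from
    the source (often just expired there), or present on both sides
    with differing contents."""
    src, tgt = fingerprint(source), fingerprint(target)
    missing_on_target = [r for r in src if r not in tgt]
    expired_on_source = [r for r in tgt if r not in src]
    mismatched = [r for r in src if r in tgt and src[r] != tgt[r]]
    return missing_on_target, expired_on_source, mismatched
```

Anything in the first or third list would have to be resolved before manipulation proceeds; entries in the second list are the benign source-expiry case the post mentions.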

These changes have made the tool very reliable.

There have been some questions about Level 4 engagements, namely appliance-to-appliance engagements. Our solution for Levels 1-3, with merges and migrations, will work the same way. Additional features to be added include a stand-alone tool for renaming an appliance and for changing its role, say from master to media or vice versa. In all cases, there will be no movement of MSDP data from the source to the target domain. Stay tuned for more news on this front.