How to get individual dedupe/compression/rawspace rates of clients off your Data Domain appliance

Created: 18 Jun 2013 | 4 comments

Greetings, I wanted to share a script I created that retrieves generalized deduplication data from a Data Domain on a per-client basis. I thought it might help out folks in the community.

I have found it very helpful for finding clients that don't dedupe well, especially Oracle encrypted/compressed backups.

As with any script of this nature, your mileage may vary. It works on Solaris 10 with DDOS 5.2.x. The directory structure differs between DDOS 4, DDOS 5.0, and DDOS 5.2, so the script will not work on those prior versions. If people are interested, I can post the changes you'd have to make for them.

This script relies on the dos2unix binary, which is freely available.
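If dos2unix isn't installed on your host, one possible substitute (my own suggestion, not part of the original script) is `tr -d '\r'`, which strips the carriage returns from the Data Domain's output the same way:

```shell
# Fallback for dos2unix: delete carriage returns with tr.
# For this script's purpose (normalizing ssh output), the result
# is equivalent to piping through dos2unix.
printf 'Total files: 5\r\n' | tr -d '\r'
```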

You must also have a shared SSH key between the host you run the script on and the Data Domain you are querying. It must be passphraseless so you aren't continually prompted for a password.

Feel free to suggest improvements to the script; sometimes I'm kind of a hack.

The usage is: scriptname.ksh &lt;FQDNofDD&gt; &lt;LSUname&gt;

 

#! /bin/ksh
 
#Set Variables
#
#Executables
DOS2UNIX=/usr/bin/dos2unix
 
#Temporary files
if [ ! -d /tmp/DDTEMP/dd880 ]
        then
        mkdir -p /tmp/DDTEMP/dd880
fi
#General variables
TEMP=/tmp/DDTEMP/dd880
report1=$TEMP/var1$2
junk=$TEMP/junk$2
EMAIL=someone@somewhere.com
 
#Usage info
if [ -z "$1" ] || [ -z "$2" ]
        then
echo "This command must be used with arguments"
echo "get_comp_rates.ksh <Data Domain FQDN> <LSU>"
echo "example: ./get_comp_rates.ksh mydd880.emc.com emclsu"
exit 1
fi
#clean up any files lying around
if [ -f $report1 ]
        then
        rm -f $report1
fi
if [ -f $junk ]
        then
        rm -f $junk
fi
#Report for dedupe rates
print "CLIENT                                   TOTALCOMP       ORIGINAL                GCOMP           LCOMP" >> $report1
for i in `ssh -q sysadmin@$1 filesys show compression /data/col1/$2/* |grep -v .boost| awk -F\/ ' { print $5 } '| awk -F_ ' { print $1 } ' | sort -u`
        do ssh -q sysadmin@$1 filesys show compression /data/col1/$2/$i* | grep -v ost | $DOS2UNIX > $junk
              a=`cat $junk | grep "Total files" | awk ' { print $5 } '`
              b=`cat $junk | grep "Original" | awk ' { print $3 } '`
              c=`cat $junk | grep "Globally" | awk ' { print $4 } '`
              d=`cat $junk | grep "Locally" | awk ' { print $4 } '`
                print "$i                       $a              $b              $c      $d"  >>$report1
              done
#Mail report out
cat $report1 | mailx -s "$1 client dedupe rates" $EMAIL
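As a follow-up, the emailed report lends itself to quick filtering. The sketch below is my own addition, and it assumes the TOTALCOMP column holds the overall compression factor reported by the Data Domain; it prints clients that fall below a chosen threshold, which is how I'd hunt for poorly-deduping clients:

```shell
# Flag clients whose overall compression factor (column 2, TOTALCOMP,
# meaning assumed) is below 2.0x. Skips the header row. The here-doc
# stands in for the generated report file.
awk 'NR > 1 && $2 + 0 < 2.0 { print $1, $2 }' <<'EOF'
CLIENT TOTALCOMP ORIGINAL GCOMP LCOMP
oradb01 1.1 500.0 1.0 1.1
fileserv 14.2 900.0 7.0 2.0
EOF
```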

Comments (4)

StefanoCarrara:

 

Thank you very much, this can be really helpful. Just one question for you: how do you send the password to the Data Domain via ssh?
 
thanks again
quebek:

@StefanoCarrara

Please check this command on your DataDomain system:

adminaccess add ssh-keys [user <username>]
Add an SSH public key, created on a remote machine, to the SSH authorized keys file on
the Data Domain system. The operation allows users to log in without a password.
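Putting that together, a minimal key-setup sketch might look like the following. The key path, key type, sysadmin user, and hostname are my assumptions for illustration, not from the original post:

```shell
# Generate a passphraseless key pair on the backup host
# (key path and type are example choices).
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa_dd

# Install the public key on the Data Domain so the script can
# log in as sysadmin without a password prompt.
ssh sysadmin@mydd880.emc.com adminaccess add ssh-keys < ~/.ssh/id_rsa_dd.pub
```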