Get Your Money Out of Storage
- From CIO Digest, July 2009 Issue
It’s a reality no IT decision maker likes: the need for storage is expected to grow at two to three times the rate of IT budgets in coming years.
Some organizations, however, are finding ways not only to increase their control over storage growth, but even to reclaim six- and seven-figure sums from current storage investments. What do their strategies have in common? “The biggest overall issue in storage to deal with first is visibility,” observes Steve Duplessie, founder and senior analyst at the Enterprise Strategy Group, a leading technology industry analyst firm. “You need to answer ‘what do I have?’ The second issue is ‘why should I have it?’ And finally, it’s looking at how to change processes to gain efficiency.”
What strategies in storage control are working best? How might you benefit from them? To find out, CIO Digest spoke with several key IT decision makers to learn how they’re maximizing storage capacity—and avoiding substantial costs or turning up storage budget savings that they can redirect elsewhere.
Like many of his IT peers, Doron Ytshaki is astonished at the rate of storage growth.
He’s chief technology officer for Clalit Health Systems, Israel’s largest HMO. Over 3.8 million people turn to Clalit for care at 14 hospitals and more than 1,300 healthcare clinics. At 200 terabytes, the organization’s data store is growing quickly, which poses management challenges. “Planning is also a challenge,” Ytshaki notes. “We try to purchase storage for a year ahead. We add 16 terabytes and think it will last a year, but it’s used up in three months.”
Ytshaki is searching for an automated tool that will help the team locate and reclaim unused disk space and move older, less important data off primary disk. “Right now, we manually scan all the files in our system to age them and try to move them off to less expensive storage,” he says. “It’s a very time-consuming process.”
Half a world away in Singapore, Lai Loong Fong had similar challenges—and found strategies that solved them. As deputy director at Singapore’s A*STAR Computational Resource Centre, Loong Fong leads a team that manages storage resources for 22 scientific research institutes. Just five staff administer over a petabyte of data that’s expanding at 50 percent per year and resides on 20 storage systems from six major vendors.
To gain visibility as well as simplify and centralize management processes, Loong Fong sought an automated storage management and reporting solution that could handle heterogeneous environments. He chose Veritas CommandCentral Storage.
Streamlining management with centralized, standardized processes has reduced administration time by 25 percent, he reports. Most importantly, gaining visibility into the 20 storage systems has enabled his team to identify and reclaim 30 terabytes of wasted SATA disk space. At about $8.40 per gigabyte, that adds up to $258,000 worth of disk space that is now available.
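Loong Fong's figure is easy to verify with simple arithmetic. The quick sanity check below assumes the binary convention of 1 TB = 1,024 GB, which the article does not state:

```python
# Sanity-check the reclaimed-capacity value cited above.
# Assumes 1 TB = 1,024 GB (binary convention); the article doesn't specify.
reclaimed_tb = 30
cost_per_gb = 8.40

reclaimed_gb = reclaimed_tb * 1024
value = reclaimed_gb * cost_per_gb
print(f"{reclaimed_gb:,} GB reclaimed, worth about ${value:,.0f}")
```

At 30,720 GB, the total comes to roughly $258,000, matching the figure reported.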
Increased visibility has other benefits. “We also have improved reliability since we are now able to immediately identify any connectivity problems with the SAN switches or any form of failure on any of the individual arrays,” Loong Fong adds. “We have a heightened awareness of our storage environment, and we’re confident that we can catch any issues before they become big problems. This is particularly valuable in a heterogeneous environment such as ours.”
Is thin provisioning a must?
A*STAR is deploying a second strategy to further reduce costs: thin provisioning.
“You are nuts not to use thin provisioning,” observes Enterprise Strategy Group’s Duplessie. “It’s ‘just in time’ and eliminates forced inefficiencies—like over-allocating or over-provisioning. Not all types of thin provisioning are the same—you need to be smart and careful—but it’s hard to argue with the concept.”
A*STAR’s Loong Fong agrees. He plans to use thin provisioning and utility storage to let his organization give up ownership of its storage systems to the storage vendors. A*STAR will pay only for disk space actually used, and with thin provisioning that will be far less space than traditional provisioning would require. “We foresee at least a 10 percent reduction in total storage cost of ownership,” Loong Fong explains.
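The pay-per-use model Loong Fong describes comes down to a simple accounting difference between virtual (promised) and physical (consumed) capacity. A minimal sketch follows; the class, volume sizes, and rate are illustrative, not A*STAR’s actual figures:

```python
# Minimal sketch of thin-provisioning accounting.
# All names and numbers are illustrative, not A*STAR's actual figures.
class ThinPool:
    """Physical capacity is consumed only when data is actually written."""
    def __init__(self):
        self.virtual_allocated_gb = 0
        self.physical_used_gb = 0

    def provision(self, size_gb):
        # A volume's full size is promised up front...
        self.virtual_allocated_gb += size_gb

    def write(self, size_gb):
        # ...but physical capacity (and cost) is consumed only on write.
        self.physical_used_gb += size_gb

pool = ThinPool()
pool.provision(1000)   # promise a 1 TB volume
pool.write(150)        # the application has written only 150 GB so far

cost_per_gb = 2.0      # illustrative pay-per-use rate
print(f"Thick billing: ${pool.virtual_allocated_gb * cost_per_gb:,.0f}")
print(f"Thin billing:  ${pool.physical_used_gb * cost_per_gb:,.0f}")
```

Under traditional ("thick") provisioning the organization pays for the full promised volume; under the utility model it pays only for what is actually consumed.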
To review disk usage and monitor the storage vendor’s charges, the team will use Veritas CommandCentral Enterprise Reporter. Loong Fong plans to have the new storage business model in place by the end of 2009.
Deduplication: 80 percent less backup data
But don’t stop there; there are other ways to reduce storage costs.
Back in Israel, Clalit faced a challenge shared by many widely distributed organizations: consistent, reliable backups. At more than 700 of its clinics, servers were backed up locally to tape drives. However, the process was expensive and not as reliable as needed.
Ytshaki and team responded by evaluating deduplication technologies. They wanted to deduplicate data at the source. If that process could reduce data volume enough, they could centralize backups over their 256 Kbps WAN to Clalit’s Tel Aviv data center.
NetBackup with its NetBackup PureDisk deduplication technology met their requirements. It has reduced the volume of data that needs to be backed up by 80 percent. “For example, if all of our 300 clinics have about five terabytes of data to back up, we store about one terabyte in our data center because our deduplication technology is reducing data at the source—meaning in the clinics—and NetBackup PureDisk is also deduplicating at the target—our data center,” Ytshaki says.
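Source-side deduplication of the kind described above boils down to content-addressed chunking: hash each chunk of data and transfer only chunks the target has never seen. The toy below illustrates only the principle; the chunk size, sample data, and in-memory store are all illustrative, and real products such as NetBackup PureDisk use far more sophisticated chunking:

```python
import hashlib

# Toy sketch of source-side deduplication: split data into fixed-size
# chunks, hash each, and ship only chunks the target has not seen.
CHUNK = 4096

def dedup_backup(data: bytes, store: dict) -> int:
    """Store unique chunks; return the bytes actually transferred."""
    sent = 0
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:          # only new content crosses the WAN
            store[digest] = chunk
            sent += len(chunk)
    return sent

store = {}
payload = b"patient-record" * 10000      # highly repetitive sample data
sent = dedup_backup(payload, store)
print(f"{len(payload)} bytes of data, {sent} bytes transferred")
```

Because the sample payload is highly repetitive, only a handful of unique chunks cross the simulated WAN, which is exactly the effect that lets a distributed organization centralize backups over a slow link.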
Backup time is also much faster. When Clalit backed up to tape, a number of clinics were exceeding their seven-hour backup windows. However, with deduplication and backup over the WAN, the backup window is being met easily “and recoveries are much more reliable,” Ytshaki reports.
Adds Enterprise Strategy Group’s Duplessie: “We’ve seen staggering reduction rates from deduplication, which vary of course, but a 5x to 50x reduction is right in the range. Deduplication can and should happen throughout the lifecycle of data, which offers clear benefits. Where exactly to do it or how depends on your situation, but the closer to the point of origin, the better in terms of overall benefit.”
Archiving unnecessary data
To cut storage costs further, ask three simple but profound questions, Duplessie advises:
Question #1: Is data still changing?
Question #2: If no, is the data still frequently accessed?
Question #3: If no, should I continue to treat the data the same way?
“These simple questions will make a huge difference—whether the data is unnecessary or absolutely necessary,” Duplessie explains. “Treatment is the issue, and simple changes in treatment can have big effects.”
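Duplessie’s three questions amount to a simple decision procedure. A minimal sketch, with tier names of my own choosing rather than anything from the article:

```python
# Duplessie's three questions expressed as a tiering decision.
# Tier names are illustrative, not from the article.
def storage_tier(is_changing: bool, is_frequently_accessed: bool) -> str:
    if is_changing:
        return "primary"    # Q1: still changing -> active data, fast disk
    if is_frequently_accessed:
        return "primary"    # Q2: static but hot -> still worth fast disk
    return "archive"        # Q3: static and cold -> treat it differently

assert storage_tier(True,  True)  == "primary"
assert storage_tier(False, True)  == "primary"
assert storage_tier(False, False) == "archive"
```

Only data that is neither changing nor frequently accessed is a candidate for cheaper treatment, which is the "simple change" Duplessie says has big effects.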
This is true at Clalit. The Microsoft Exchange databases were full of old messages that were rarely accessed. The system department team tested archiving solutions and deployed Symantec Enterprise Vault to archive messages older than three weeks, moving them from primary storage to secondary storage.
Since first-tier Fibre Channel disk costs about $10 a gigabyte and second-tier SATA storage about $2 a gigabyte, migrating old data can make a profound difference. Clalit can potentially move 20 terabytes of email off primary disk—reclaiming $60,000 in disk space.
Archived messages still appear to be in users’ inboxes, and users can retrieve them with a double click. As a result, users stay within their mailbox quotas and have stopped using troublesome PST files. Clalit plans to roll out Enterprise Vault to the rest of its email accounts and implement an archival solution for SAP data.
Simplified storage saves lives
Just about every organization would like to overhaul and upgrade its storage—but how can the investment be justified?
Mississippi Baptist Health Systems, with more than 600 beds, is the largest private general hospital in Mississippi. It was able to justify a significant investment in a new storage infrastructure by demonstrating that the new design has the potential to save lives. In addition, archived patient information can now be retrieved in minutes instead of hours, and the new infrastructure, unlike the old, is ready to scale and remain compliant for years ahead.
“A few years ago, we had hundreds of servers in our data center, and each had its own storage,” recalls Becky Carruth, the hospital’s director of information services. “Ninety-five percent of our storage was wasted. We didn’t have redundancy or scalability. Backup systems were siloed and retrieval was slow. But in a few short years, Jimmy Touchstone and other senior network engineers have been able to plan and deploy a storage make-over that has been very successful.”
“We’re in the life-saving business,” explains Touchstone, a senior systems engineer. “We must get the information to caregivers at the speed of ‘right now’ because the decisions they have to make may save a person’s life.”
Mississippi Baptist worked with Symantec Partner IBM and Symantec to develop a comprehensive storage make-over that could be implemented in phases. The first phase moved Tier 0 and Tier 1 data to an IBM System Storage DS8300 enterprise disk storage system. Medical imaging data was consolidated on an IBM System Storage N5200 disk storage system.
Phase two enhanced backup and archiving. It consolidated five backup libraries and three backup software tools onto an IBM System Storage TS3500 tape library with hierarchical storage management (HSM) software. “We now have one big, high-speed storage environment, and Veritas NetBackup and Symantec Enterprise Vault allowed us to get our archive into it,” Touchstone explains. “The TS3500 just holds data—our compliancy archive, including electronic medical records (EMRs)—protected in a 256-bit encrypted, write-once-read-many (WORM) format. That means we can ship tapes offsite securely for disaster protection.”
Tape vs. disk savings
The team made a strategic decision to make tape, not disk, the foundation of the archive. They upgraded the tape library to high-performance IBM System Storage TS1120 drives. The speed of those tape drives makes all the difference, according to Touchstone. “We’ve seen archived medical studies that used to take from one hour to up to 24 hours to retrieve for a caregiver go down to a minute or a minute and a half,” Touchstone says. “This type of speed can make a big difference in a department like surgery.”
Using tape instead of disk saved money, Touchstone reports. “Another storage vendor’s proposal to use disk would have cost $900,000 more than the approach we took to leverage tape with the IBM hardware and Veritas NetBackup solution,” he explains. “What’s more, by deploying NetBackup and Enterprise Vault with the IBM tape system, we’ve been able to consolidate not just one archive solution but every one the hospital uses. Four are in the system now, and I’m working on the fifth. We have 100 terabytes on tape, and the retrieval speed is virtually the same as if it were on disk. And my administrative time for the system is only about two hours a week.”
How did the team sell such a comprehensive make-over at a non-profit hospital? “It was broken up in phases so we could get it done during different fiscal years,” Carruth explains. “And we were able to show that if we spent this much right now, down the line we wouldn’t have to re-engineer this solution. It was a long-term investment in a scalable foundation that wouldn’t have to be forklifted later.”
Adds Touchstone: “An IBM filer holds our medical imaging studies. What keeps us from having to add more filers is the ability of Enterprise Vault to take a huge medical imaging study, create a stub of it, and move it off to tape so the software thinks it’s on the filer when it’s on tape. When it’s needed, it’s brought back to the filer where the software can read it. That’s a cost-efficient strategy. Every time we avoid having to purchase another filer, we’re saving another $350,000.”
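The stub-and-recall pattern Touchstone describes is the core idea of hierarchical storage management: replace a large file with a tiny pointer and rehydrate it transparently on access. The toy below sketches it with ordinary files; the file names, the "tape" directory, and the stub format are my own illustrations, and Enterprise Vault's actual mechanics are far more involved:

```python
import os

# Toy sketch of stub-and-recall (HSM): move bulk data to cheap storage,
# leave a small stub behind, and rehydrate on access.
def archive_with_stub(path: str, tape_dir: str) -> None:
    os.makedirs(tape_dir, exist_ok=True)
    dest = os.path.join(tape_dir, os.path.basename(path))
    os.replace(path, dest)            # move the bulk data off the "filer"
    with open(path, "w") as stub:     # leave a tiny pointer in its place
        stub.write(dest)

def recall(path: str) -> bytes:
    with open(path) as stub:          # follow the pointer...
        dest = stub.read()
    with open(dest, "rb") as f:       # ...and rehydrate from the archive
        return f.read()

with open("study.img", "wb") as f:
    f.write(b"\x00" * 1_000_000)      # stand-in for a large imaging study
archive_with_stub("study.img", "tape")
print(os.path.getsize("study.img"))   # the stub is only a few bytes
assert recall("study.img") == b"\x00" * 1_000_000
```

The filer sees only the stub, so its capacity is preserved; that is the mechanism behind avoiding each additional filer purchase.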
Set a retention policy, then delete and reclaim
Another way to reduce storage costs is to establish a retention policy and to delete old data. The Screen Actors Guild – Producers Pension and Health Plans (SAGPH), based in Burbank, California, is about to do just that, and it has plenty of old data. SAGPH processes over 700,000 pension and healthcare claims a year and covers over 40,000 actors and their dependents.
“We’re about to publish our records retention policy,” says Kevin Donnellan, chief information officer at SAGPH. “Coming up with it has been a very involved process that included polling all the business units. Until now, we’ve kept all records indefinitely.”
The new policy establishes retention periods ranging from six months to 12 years, depending on the type of information in the document. The retention period will be automatically enforced by Oracle Universal Records Management (URM), software that applies retention policies to content in file systems, content management systems, and email archives.
“As our retention policies go into effect, they could enable us to delete as much as 10 to 15 percent of our data store a year,” Donnellan projects. “That means we could avoid purchasing two to three terabytes a year, at $25,000 per terabyte. Payback will be in six months. And having a retention policy that is consistently enforced across all record types will dramatically reduce our exposure to legal and compliance risks.”
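Donnellan's projection is easy to check. The figures below come from the article; splitting them into a low/high range is my own presentation:

```python
# Back-of-the-envelope check on the avoided purchases cited above.
# Figures are from the article; the low/high split is illustrative.
cost_per_tb = 25_000
avoided_tb_low, avoided_tb_high = 2, 3

low = avoided_tb_low * cost_per_tb
high = avoided_tb_high * cost_per_tb
print(f"Avoided purchases: ${low:,} to ${high:,} per year")
```

Avoiding two to three terabytes of purchases a year at $25,000 per terabyte works out to $50,000 to $75,000 annually, which is consistent with a payback measured in months.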
Standardize and centralize backup to reduce costs
Another way that SAGPH reduced storage costs is by streamlining backup procedures. Ten years ago the organization backed up individual servers using local cartridge drives. Backup administrators were spending 16 hours each week managing backups, and the system wasn’t scalable.
In 2002, Donnellan and his team standardized backup and recovery on Veritas NetBackup, which centralized management of backup and recovery across the organization’s HP-UX, Red Hat Enterprise Linux, and Microsoft Windows operating environments and provided needed scalability. Forty percent of one IT administrator’s time became available for more strategic tasks, worth $293,000 for the years 2000 through 2008.
The ultimate strategy: start from scratch
When you need to think in innovative ways about how to solve your storage challenges and reduce costs, Duplessie offers a simple tip: walk away from your infrastructure. Think outside your boxes. Start with a clean slate.
“Re-evaluate your assumptions,” he explains. “Go to the white board and draw up a workflow as if you didn’t have all the ‘junk’ you really have. How would you draw up data treatment policies if you were starting from scratch? Try it. It’s enlightening.”
Alan Drummer is creative director for content at NAVAJO Company. His work has appeared in the Los Angeles Times, San Francisco Examiner, Create Magazine, and on The History Channel.