Such utilization rates may be acceptable during periods of robust economic growth, but in tough economic times they simply don’t make sense. Having an array that is only 50% utilized is like paying twice as much for the storage needed. Idle capacity also consumes power, increases cooling costs, and unnecessarily consumes floor space, which is often at a premium.
As a first step, storage managers need to assess the situation and gain an enterprise-wide view of the storage environment. What is the average utilization rate? What is the utilization rate by application? Which applications are growing fastest (or slowest)?
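Answering those questions ultimately comes down to simple arithmetic over an inventory of allocated versus consumed capacity. The sketch below is purely illustrative, assuming a hypothetical inventory of per-application records; the field names and figures are invented, not drawn from any real environment.

```python
# Hypothetical sketch: computing enterprise-wide and per-application
# utilization from an inventory of (application, allocated, used) records.
# All names and numbers here are illustrative assumptions.

records = [
    {"app": "core-banking", "allocated_tb": 40.0, "used_tb": 29.5},
    {"app": "trading",      "allocated_tb": 25.0, "used_tb": 8.0},
    {"app": "email",        "allocated_tb": 15.0, "used_tb": 10.5},
]

total_alloc = sum(r["allocated_tb"] for r in records)
total_used = sum(r["used_tb"] for r in records)
print(f"Average utilization: {total_used / total_alloc:.0%}")

# Utilization by application, lowest first -- the reclamation candidates.
for r in sorted(records, key=lambda r: r["used_tb"] / r["allocated_tb"]):
    print(f"{r['app']:14} {r['used_tb'] / r['allocated_tb']:.0%}")
```

Even a report this simple makes the low-utilization outliers obvious at a glance.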
As managers soon learn, however, putting the big picture together is a manually intensive task. In the past, when applications and their data resided on a single system, knowing where information resided and how it was being managed was relatively simple. If an application was running on server A, then the data associated with that application resided on the disk drives attached to server A.
But as environments evolved into distributed architectures where multiple servers and applications share storage and network resources, understanding where information lives and how it is being managed becomes much more complex. In today’s storage environments it is difficult to determine how storage has been allocated across various arrays, which hosts are consuming that storage, how effectively the applications running on those hosts are utilizing that storage, and what type of data is stored there.
A storage resource management (SRM) tool can help storage managers get answers to these questions. It’s been said that managing storage without an SRM tool is like going on a journey without a map.
With an SRM tool like Veritas CommandCentral Storage, managers can assess the storage situation, quickly identify problem areas and consolidation opportunities, and create a prioritized list of solutions.
Reporting on current operations also accomplishes two things. First, it identifies areas with low utilization that can be fixed immediately at no cost to the bank. Second, periodic benchmarking demonstrates tangible cost savings in ways that both technical and non-technical managers can appreciate.
“Storage has been growing wildly, and there hasn’t been a lot of effort in knowing how to manage it,” says Sean Derrington, Director, Storage Management and High Availability, at Symantec. “But in the current economic cycle, companies want to know how to do it better – how to use processes better and take costs out of the budget. We’re having these conversations now with senior management.”
“We hear companies all the time say, ‘I don’t know what I’ve got,’” adds Phil Goodwin, Senior Manager, Product Marketing, at Symantec. “Their No. 1 challenge is getting a handle on storage. With Symantec’s help, they can identify unused capacity, reclaim wasted storage capacity, save information once, and not archive what they don’t need.”
Reducing your data footprint with a centralized archiving solution not only helps rationalize storage resources and dedicate primary (and more expensive) storage to dynamic and transactional data, but it can also lower the costs and risks of e-discovery and compliance with such regulations as SEC Rule 17, NASD 3010, and Sarbanes-Oxley.
For example, the U.S. investment arm of one of the world’s largest financial services companies consolidated its file servers from 575 to 200 with a tiered-storage strategy, resulting in a dramatic reduction in overhead costs and manpower requirements to manage the environment. The institution was also able to lower its annual server purchases by 50%, significantly reduce the time for server administration, and increase overall productivity for more important IT tasks. It is estimated the company has realized overall savings of several million dollars from an increase in overall efficiencies and IT productivity.
Another Symantec customer, a Swiss bank, increased its storage utilization from 12% to 41% with CommandCentral and implemented a charge-back system to show application owners the storage costs their applications were incurring. As a result, the bank was able to defer storage purchases for a year, putting off buying 9 petabytes of capacity at a cost of $30,000 per terabyte.
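The mechanics of a charge-back report are straightforward: bill each application owner for consumed capacity at a published rate. The sketch below is a hypothetical illustration; the application names and capacities are invented, and the flat per-terabyte rate simply echoes the figure cited above.

```python
# Hypothetical charge-back sketch: cost each application's consumed
# capacity at a flat rate per terabyte. Names and figures are invented;
# the rate mirrors the $30,000/TB example cited in the text.

RATE_PER_TB = 30_000  # USD per terabyte (illustrative)

consumption_tb = {"risk-analytics": 120.0, "crm": 45.5, "archive": 300.0}

def chargeback(consumption, rate):
    """Return an {app: cost} map so owners see what their storage costs."""
    return {app: tb * rate for app, tb in consumption.items()}

for app, cost in chargeback(consumption_tb, RATE_PER_TB).items():
    print(f"{app:15} ${cost:,.0f}")
```

Putting a dollar figure next to each application is often what turns unchecked growth into a managed budget line.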
In general, there is “intense, accelerated interest” among FSIs in improving their storage utilization rates, particularly as consolidation continues to spread throughout the industry and companies scramble to get their back ends coordinated, Derrington and Goodwin observe.
A relatively new technology that has gained mainstream acceptance is thin provisioning. It allows administrators to present more capacity to applications and hosts than is physically installed, with all volumes drawing on a shared pool as data is actually written. Thin provisioning challenges the long-standing storage approach of dedicating physical capacity up front based on projected allocation. The result is higher capacity utilization, less guesswork when provisioning new applications, and reduced capital expenditures and operating costs.
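The idea can be made concrete with a toy model. The sketch below is an illustrative simulation only, with invented class and method names: volumes are promised more capacity than the shared pool physically holds, and physical blocks are committed only when data is written.

```python
# Hypothetical toy model of thin provisioning. Allocation is mere
# bookkeeping; physical capacity is consumed only on write. All names
# here are invented for illustration.

class ThinPool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb
        self.committed_gb = 0    # physically backed by written data
        self.allocated_gb = 0    # promised to hosts (may exceed physical)

    def create_volume(self, virtual_gb):
        # Allocating a volume consumes no physical capacity.
        self.allocated_gb += virtual_gb

    def write(self, gb):
        # Physical blocks are committed only when data lands.
        if self.committed_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: add physical capacity")
        self.committed_gb += gb

pool = ThinPool(physical_gb=100)
pool.create_volume(80)   # host A sees an 80 GB volume
pool.create_volume(80)   # host B sees 80 GB too: 160 GB promised on 100 GB
pool.write(30)           # only written data draws down the pool
print(pool.allocated_gb, pool.committed_gb)
```

The trade-off, of course, is that an oversubscribed pool must be monitored so physical capacity is added before writes outrun it.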
While most major array manufacturers support thin provisioning, the management tools from these manufacturers are highly vendor-specific. IT departments should explore the latest release of Symantec’s Veritas Storage Foundation solution, which is “thin provisioning aware” and supports all thin provisioning architectures currently available.
Storage Foundation’s SmartMove feature enables hardware-independent online migrations of application information from traditional (or “thick”) volumes to thin provisioned volumes. This capability enables organizations to move quickly into thin provisioned environments without carrying over wasted storage capacity.
Data deduplication is another recent technology that enables companies to eliminate duplicate backup data and significantly decrease their storage consumption. For example, if a Microsoft PowerPoint presentation is stored on different file servers multiple times, deduplication ensures that only one copy is stored, no matter how many full or incremental backups occur.
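Under the hood, deduplication typically works by identifying content by a cryptographic hash so identical chunks are stored only once. The sketch below is an illustrative toy version of that idea, with invented class names; it is not a description of any particular product’s implementation.

```python
# Hypothetical sketch of content-based deduplication: chunks are keyed
# by their SHA-256 digest, so identical content -- e.g. the same
# presentation backed up from several servers -- is stored only once.

import hashlib

class DedupStore:
    def __init__(self):
        self.chunks = {}  # digest -> bytes (one physical copy per unique chunk)

    def backup(self, data: bytes, chunk_size: int = 4096):
        """Store data as chunks; return the list of chunk digests (a 'recipe')."""
        recipe = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # new chunks only
            recipe.append(digest)
        return recipe

store = DedupStore()
deck = b"slide content" * 1000        # the "same file" on three servers
for _ in range(3):
    store.backup(deck)

logical = len(deck) * 3                               # data backed up
physical = sum(len(c) for c in store.chunks.values())  # data stored
print(logical, physical)
```

Backing up the file three times consumes the physical space of one copy; only the lightweight recipes grow, which is where the large reduction ratios come from.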
The built-in data deduplication technology of Veritas NetBackup PureDisk reduces data backup volume by as much as 90% and reduces the bandwidth needed by 97%. Microsoft Exchange backups can be reduced by as much as 98%.
According to a recent study by the Taneja Group, data deduplication can lead to data reduction ratios of 20:1 or more over time with no data loss.
The recent economic downturn has forced all organizations to seek ways to reduce the costs associated with storing information, and to better understand what information resources exist and why. By gaining insight into current storage assets, where they are located, who is consuming them, how effectively they are being utilized, and what the value of the data is, financial institutions can gain a handle on part of the IT budget that has gone unchecked for far too long.
Of course, financial institutions must also find ways to cut costs this quarter, not 12 to 24 months from now. With Symantec’s help, these institutions can gain insight into their physical and virtual storage environments so they can quickly reduce operational costs, maintain high levels of availability, and eliminate unnecessary storage hardware purchases.