Storage and Availability Management
Showing posts tagged with Storage Foundation
Sanjays | 25 Aug 2011 | 0 comments

Exploding storage growth is the biggest problem faced by storage administrators; it gives many a sleepless night and forces them to figure out ways to reduce storage consumption and thus control costs. While thin-provisioning hardware provides the capability to reduce storage, its savings remain limited for the most part because of the lack of information at the array level. Veritas Storage Foundation helps admins get the most out of their thin hardware by utilizing the intelligence of the file system to make thin provisioning better.

Veritas Storage Foundation helps admins get thin and stay thin across disparate hardware. Its SmartMove feature helps admins move data to thin-capable hardware, while its thin reclamation feature helps them keep data growth in check on an ongoing basis.
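As a rough illustration of what that ongoing reclamation can look like in practice, below is a minimal sketch of a script an admin might schedule (via cron, for example). It assumes a host with the standard VxVM/VxFS utilities (vxdisk, fsadm) in the PATH and LUNs presented from a thin-reclamation-capable array; the disk group name 'datadg' and mount point '/data' are placeholders, and exact command syntax varies between Storage Foundation releases, so check the vxdisk and fsadm manual pages on your system before relying on it.

    #!/usr/bin/env python3
    """Hypothetical sketch: drive Storage Foundation thin reclamation on a schedule.

    The disk group 'datadg' and mount point '/data' are placeholders; command
    syntax can differ between Storage Foundation releases.
    """
    import subprocess

    def run(cmd):
        """Run a command, return its stdout, and raise if it fails."""
        result = subprocess.run(cmd, check=True, capture_output=True, text=True)
        return result.stdout

    # 1. Show which LUNs VxVM sees as thin or thin-reclaim capable.
    print(run(["vxdisk", "-o", "thin", "list"]))

    # 2. Reclaim unused space across a disk group (placeholder name).
    run(["vxdisk", "reclaim", "datadg"])

    # 3. Or reclaim free space behind a mounted VxFS file system.
    run(["fsadm", "-R", "/data"])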

In the white paper "Thin reclamation for the IBM XIV Storage System and IBM Storwize V7000 using Veritas Storage Foundation Enterprise HA", IBM...

DLamorena | 22 Aug 2011 | 0 comments

Listen to Symantec and VMware leaders Ashish Yajnik, Symantec Product Manager, and Julio Tapia, VMware Director of Alliances Marketing, as they discuss best practices for delivering a cost-effective, high-performing virtual desktop environment. Afraid that virtualization translates into scrapping your existing storage infrastructure? Learn how you can leverage your existing storage investments to rein in the costs associated with virtual machine sprawl. You will also learn how VMware View works with Symantec VirtualStore to serve up virtual desktops and provide a comprehensive desktop management solution. Details around VirtualStore's space-optimized snapshots will also be shared.

 

...
Sanjays | 23 Jul 2011 | 0 comments

Cloud: Evolutionary or Revolutionary?

For many, the cloud has been a marketing term with no real products and just a lot of vaporware. But the cloud is finally becoming real, and cloud discussions are becoming more mature. On the Gartner hype cycle, the cloud is beginning to emerge from the 'peak of inflated expectations' and heading into the 'trough of disillusionment' (find more information at: http://www.readwriteweb.com/archives/gartner_hype_cycle_2010_cloud_computing_at_the_pea.php ). With this maturity have come some real questions: Is the cloud an evolutionary or a revolutionary step? Does the cloud mean 'net new', or is it just another step in the evolution of the data center?

Tell us more about your journey to the cloud. Share with us how you are implementing your cloud: are you taking an evolutionary or...

bgoodyear | 18 Jul 2011 | 0 comments

"The future of the datacenter has gotten really cloudy lately..."

It's the season of IT vendor user conferences. Wonder what the focus of the conferences is this year? I'm sure it won't come as a surprise that every conference has a theme around "The Cloud". The most interesting thing to me is that each conference has "The Solution" as well. If you sit in these conferences for very long, you quickly figure out that you can have your cloud just by using the vendor's products. This "cloud in a box" solution sounds really good until you start looking inside their cloud and understand what's really happening. By choosing their solution, you have just locked yourself into that vendor's products for the cloud. It's the same old story: vendor lock-in with just a different name or technology.

When we think about the cloud, the whole...

Raissa_T | 06 Jul 2011 | 0 comments

For some organizations, the idea of virtualizing their mission-critical applications is perceived to be riskier, and its advantages are not yet compelling enough to make the switch. In fact, according to Symantec's latest Virtualization and Evolution to the Cloud Survey, 40% of CEOs and 42% of CFOs are reluctant to make the leap to virtualization. Some of the cited reservations are around downtime and the lack of visibility into virtualized environments. However, IT users can get around these challenges by making sure they deploy virtualization tools suited to their particular needs. A solid virtualization plan can deliver cost savings, efficiency, increased productivity, high availability, disaster recovery, and more. Recently, Dan Lamorena, Product Marketing Director within Symantec's Storage and Availability Management Group, authored an article in the ...

Sanjays | 23 Jun 2011 | 0 comments

A lot of us are bombarded with messages about the cloud and differing views and opinions on what a cloud actually is. It looks like the National Institute of Standards and Technology has weighed in on the topic to settle it once and for all.

Check out their definition at: http://www.nist.gov/itl/cloud/upload/cloud-def-v15.pdf

You can also find more information about Symantec's view of the cloud at http://go.symantec.com/cloudavailability

Sanjays | 23 Jun 2011 | 0 comments

The explosive growth of data is one of the top challenges every data center faces today. With this in mind, it becomes very important to understand the 'cloud' storage models that can help alleviate some of these problems while delivering storage with pre-defined SLAs.

This article does a great job of explaining Storage-as-a-Service: what it is and what it's not. Storage-as-a-Service gives you the benefits of an internal service with automated, pooled storage within a private data center.

To understand Storage-as-a-Service read the blog:

http://chucksblog.emc.com/chucks_blog/2009/08/reconsidering-storage-as-a-service.html

To understand how you can implement your own Storage-as-a-Service architecture and the challenges read:

...

Sanjays | 16 Jun 2011 | 0 comments

Is the approach to the cloud similar to a new car purchase? Dave Elliott, Symantec Product Marketing Manager, argues in his blog in SOAworld magazine that it is indeed similar.

The crux of the argument is that you really need to think through what you are getting from the cloud, what will work for your needs, and what is expected of it, and then make sure that it actually delivers.

For instance, for some smaller businesses it makes sense to use the public cloud, but for many others a private cloud is the only way to go. Some enterprises have confidential data that needs to be very secure; others have requirements that mean they need control over their data. Organizations have to consider the privacy, security, and control they need before deciding how to approach the cloud.

Just as cars have to meet safety requirements, enterprises have similar needs for their data. Most enterprises cannot afford to lose their data or have their site...

RyanJancaitis | 14 Jun 2011 | 0 comments

 

Look back over the past 20 years and think about the size of the drives you've purchased. The first PC I owned ran Windows 95 and had a gigantic 512 MB hard drive. That machine was used primarily for writing papers, emacs-based email, and 'surfing the net'. There was no way I could conceive of ever filling up that drive, and I thought I'd have the machine for years. Then music and photos went digital by default, and the drive quickly ran out of space. When I bought my next PC, it had 8 times the storage, but came at a similar price.

That's the consumer space; it gets even more interesting from the Enterprise Storage perspective. When SANs were gaining popularity for their speed and capacity back in the early 2000s, 10 TB SANs were enormous and reserved only for the largest and most IT-aggressive customers. In 2001, EMC Symmetrix 8000s could be configured from 72 GB to...

Mike Reynolds PMM | 13 Jun 2011 | 2 comments

Hello all,

Interested in comparing performance characteristics of Storage Foundation to Solaris ZFS from the comfort of your own desk?  Check out this updated white paper at http://www.symantec.com/connect/articles/veritas-storage-foundation-and-sun-solaris-zfs-2011

Here is a brief introduction:

This white paper compares how Veritas Storage Foundation and Solaris ZFS perform for commercial workloads. The paper describes not only the raw performance numbers, but also the system and storage resources required to achieve those numbers.

Pure performance numbers are interesting, but what is often ignored is how much CPU time, memory, or I/O to disk was required to reach that number. In a performance benchmark situation, the resources used are less relevant, but for sizing server and storage infrastructure in...
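
To make the "resources used per unit of performance" point concrete, here is a small, hypothetical sketch (not taken from the white paper) of how one might sample aggregate CPU utilization while a benchmark runs. It uses only the Python standard library and Linux's /proc/stat; memory and disk I/O could be sampled similarly from /proc/meminfo and /proc/diskstats.

    #!/usr/bin/env python3
    """Hypothetical helper: sample CPU utilization while a benchmark runs,
    so throughput can be weighed against the CPU it cost. Linux-only,
    since it reads /proc/stat directly."""
    import time

    def cpu_times():
        # First line of /proc/stat holds aggregate CPU jiffies:
        # user nice system idle iowait irq softirq ...
        with open("/proc/stat") as f:
            values = [int(v) for v in f.readline().split()[1:]]
        return sum(values), values[3]          # (total, idle)

    def cpu_busy_fraction(interval=5.0):
        total0, idle0 = cpu_times()
        time.sleep(interval)
        total1, idle1 = cpu_times()
        return 1.0 - (idle1 - idle0) / float(total1 - total0)

    if __name__ == "__main__":
        # Run alongside the workload and log utilization once a minute.
        while True:
            print("CPU busy: %.1f%%" % (cpu_busy_fraction(60) * 100))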