Storage and Availability Management
Raissa_T | 06 Jul 2011 | 0 comments

For some organizations, virtualizing mission-critical applications is perceived as too risky, and the advantages are not yet compelling enough to make the switch.  In fact, according to Symantec's latest Virtualization and Evolution to the Cloud Survey, 40% of CEOs and 42% of CFOs are reluctant to make the leap to virtualization.  Some of the cited reservations concern downtime and the lack of visibility into virtualized environments.  However, IT users can get around these challenges by making sure they deploy virtualization tools suited to their particular needs.  A well-executed virtualization plan can deliver cost savings, efficiency, increased productivity, high availability, and disaster recovery.  Recently, Dan Lamorena, Product Marketing Director within Symantec's Storage and Availability Management Group, authored an article in the ...

Jennifer Ellard | 26 Jun 2011 | 0 comments

Today Symantec announced the latest release of ApplicationHA, a new solution that provides high availability for business critical applications through application-level visibility, control and recovery in VMware environments. We’re particularly excited about this new release as it addresses some key concerns shared by data center administrators as they virtualize their infrastructures.

Administrators grow more cautious about virtualizing their applications as those applications' criticality increases. Symantec's recent Virtualization and Evolution to the Cloud Survey concluded that, in spite of executives' concerns, more than 59 percent plan to virtualize database applications in the next 12 months. We see that tier 3 applications are virtualized more than tier 2, and tier 2 more than tier 1. There...
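
Conceptually, application-level availability comes down to a monitor-and-recover loop running inside the guest: probe the application itself, not just the virtual machine, and restart the service when the probe fails. Here is a minimal generic sketch in Python; the probe and restart commands are placeholders for illustration, not ApplicationHA's actual interface:

import subprocess
import time

# Placeholder commands; ApplicationHA ships its own agents, this is
# only a generic illustration of application-level monitoring.
CHECK_CMD = ["pg_isready", "-h", "localhost"]          # app-level health probe
RESTART_CMD = ["systemctl", "restart", "postgresql"]   # in-guest recovery
MAX_FAILURES = 3
POLL_SECONDS = 10

def healthy():
    """True if the application answers its health probe."""
    return subprocess.run(CHECK_CMD, capture_output=True).returncode == 0

def monitor():
    failures = 0
    while True:
        if healthy():
            failures = 0
        else:
            failures += 1
            if failures >= MAX_FAILURES:
                # Restart the service inside the VM instead of failing
                # over or rebooting the whole virtual machine.
                subprocess.run(RESTART_CMD)
                failures = 0
        time.sleep(POLL_SECONDS)

monitor()

The point of the sketch is the granularity: recovery acts on the application itself, which is exactly the visibility gap administrators cite when virtualizing critical workloads.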

Sanjays | 23 Jun 2011 | 0 comments

A lot of us are bombarded with messages about the cloud, and with differing views and opinions on what a cloud actually is. Now the National Institute of Standards and Technology has weighed in on the topic to settle it once and for all.

Check out their definition at: http://www.nist.gov/itl/cloud/upload/cloud-def-v15.pdf

You can also find more information about Symantec's view of the cloud at http://go.symantec.com/cloudavailability

Sanjays | 23 Jun 2011 | 0 comments

The explosive growth of data is one of the top challenges every data center faces today. With that in mind, it becomes very important to understand the 'cloud' storage models that can help alleviate some of these problems while delivering storage with pre-defined SLAs.

This article does a great job explaining Storage-as-a-Service: what it is and what it's not. Storage-as-a-Service gives you the benefits of an internal service with automated, pooled storage within a private data center.
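
As a rough illustration of that pooled, SLA-driven model, here is a hypothetical sketch in Python; the tier names, numbers, and chargeback rates are invented for illustration and do not reflect any particular product:

from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    iops: int            # guaranteed I/O operations per second
    availability: float  # e.g. 0.999 = "three nines"
    cost_per_gb: float   # internal chargeback rate

TIERS = {
    "gold":   Tier("gold",   10000, 0.9999, 1.00),
    "silver": Tier("silver",  3000, 0.999,  0.40),
    "bronze": Tier("bronze",   500, 0.99,   0.10),
}

class StoragePool:
    """A shared pool from which capacity is carved out against SLA tiers."""
    def __init__(self, capacity_gb):
        self.free_gb = capacity_gb
        self.allocations = []  # (consumer, gb, tier)

    def provision(self, consumer, gb, tier_name):
        tier = TIERS[tier_name]
        if gb > self.free_gb:
            raise RuntimeError("pool exhausted; expand before provisioning")
        self.free_gb -= gb
        self.allocations.append((consumer, gb, tier))
        return tier

pool = StoragePool(capacity_gb=10_000)
t = pool.provision("payroll-db", 500, "gold")
print(f"payroll-db: 500 GB at {t.availability:.2%} availability")

The consumer asks for capacity and an SLA rather than for LUNs on a specific array; that abstraction is what makes pooled storage feel like a service.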

To understand Storage-as-a-Service read the blog:

http://chucksblog.emc.com/chucks_blog/2009/08/reconsidering-storage-as-a-service.html

To understand how you can implement your own Storage-as-a-Service architecture, and the challenges involved, read:

...

Sanjays | 17 Jun 2011 | 0 comments

Here's an interesting article on emerging trends in storage, specifically scale-out NAS solutions. It does a good job explaining the need for scale-out NAS solutions and their evolution into cloud-based offerings.

The article mentions some potential solutions, including Symantec's FileStore product. FileStore is one of the fastest-performing scale-out NAS solutions on the market. While VirtualStore is not mentioned in the article, it is a software-only solution that can deliver the same benefits while using existing storage hardware.

Check out the blog at: http://searchstorage.techtarget.com/feature/Scale-out-NAS-object-storage-cloud-gateways-replacing-file-storage?asrc=EM_USC_14145246&...

Mike Reynolds PMM | 16 Jun 2011 | 0 comments

If you are considering Virtual Desktop Infrastructure, check out this blog from DCIG.  In it, Jerome Wendt discusses how SSDs and deduplication are key technologies for delivering a successful, large VDI deployment.

Here is an excerpt from the article:

Deduplication and SSDs are becoming viewed as prerequisite technologies to ensure successful VDI deployments for two primary reasons.

  • There is a substantial amount of duplication of files when migrating desktops to VDI environments. This creates a need for a large amount of storage.
  • Booting a desktop requires reading from the disks on which its images reside. When multiple desktops are booted at the same time, contention occurs and performance is negatively impacted.

Deduplication and SSDs can be used in combination to tackle these...
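
To see why deduplication pays off so dramatically for VDI, consider a toy block-level deduplicator; this is a conceptual sketch only, not any vendor's implementation. Identical 4 KB blocks shared across near-identical desktop images are stored once and referenced by their content hash:

import hashlib

BLOCK_SIZE = 4096

def dedupe(images):
    store = {}      # content hash -> unique block
    manifests = {}  # image name -> ordered list of block hashes
    for name, data in images.items():
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)  # keep each unique block once
            hashes.append(digest)
        manifests[name] = hashes
    return store, manifests

# Two nearly identical desktop images share almost all of their blocks.
base = b"windows-system-files" * 2000
images = {"desktop1": base + b"user1", "desktop2": base + b"user2"}
store, _ = dedupe(images)
raw = sum(len(d) for d in images.values())
kept = sum(len(b) for b in store.values())
print(f"raw {raw} bytes -> {kept} bytes after dedup")

Because hundreds of migrated desktops contain largely the same OS and application files, the unique-block store stays close to the size of a single image; SSDs then absorb the read bursts when many desktops boot at once and hit those shared blocks simultaneously.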

Sanjays | 16 Jun 2011 | 0 comments

Is the approach to the cloud similar to a new car purchase? Dave Elliott, Symantec Product Marketing Manager, argues that it is indeed similar in his blog in SOAworld magazine.

The crux of the argument is that you need to think through what you are getting from the cloud, what will work for your needs, and what is expected of it, and then make sure that it actually delivers.

For instance, for some smaller businesses it makes sense to use the public cloud, but for many others a private cloud is the only way to go. Some enterprises have confidential data that needs to be very secure; others have requirements that mean they need control over the data. An enterprise has to consider the privacy, security, and control it needs before deciding how to approach the cloud.

Just as cars must meet safety requirements, enterprises have similar needs for their data. Most enterprises cannot afford to lose their data or have their site...

c3lsius | 15 Jun 2011 | 0 comments

An interesting article written by Jerome Wendt of DCIG about Data Insight's new SharePoint support functionality:

Data Insight Extends Data Ownership Classification Capabilities to Microsoft SharePoint

Microsoft SharePoint is fast replacing network file servers as the preferred tool for information sharing and workplace collaboration within enterprises. But as that occurs, the same set of data management issues that exist on network file servers is re-surfacing in these environments. With Symantec now extending the capabilities of its Data Insight to reach into SharePoint, enterprises can be assured that they are only keeping the data that they need in SharePoint while confidently archiving, deleting or re-assigning the rest.

Sheila Childs, Research Director for Gartner's Storage Strategies and Technologies group,...
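
Data Insight's ownership classification rests on observed access activity. As a toy sketch of the idea, under the simplifying assumption that a file's most frequent accessor is its probable owner (the event records and field names here are hypothetical, not Data Insight's data model):

from collections import Counter

events = [
    {"path": "/sites/finance/budget.xlsx", "user": "alice"},
    {"path": "/sites/finance/budget.xlsx", "user": "alice"},
    {"path": "/sites/finance/budget.xlsx", "user": "bob"},
    {"path": "/sites/hr/policy.docx",      "user": "carol"},
]

def infer_owners(events):
    """Map each path to its most frequent accessor."""
    counts = {}
    for e in events:
        counts.setdefault(e["path"], Counter())[e["user"]] += 1
    return {path: c.most_common(1)[0][0] for path, c in counts.items()}

for path, owner in infer_owners(events).items():
    print(f"{path}: probable owner {owner}")

With a probable owner attached to each item, the archive, delete, or re-assign decisions described in the excerpt have someone to be routed to.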

RyanJancaitis | 14 Jun 2011 | 0 comments

Look back over the past 20 years and think about the size of the drives you’ve purchased.  The first PC I owned ran Windows 95 and had a gigantic 512 MB hard drive.  This machine was used primarily for writing papers, emacs-based email, and ‘surfing the net’.  There was no way I could ever conceive of filling up my drive, and I thought I’d have the machine for years.  Then music and photos went digital by default, and it quickly ran out of space.  When I bought my next PC, it had 8 times the storage, but came at a similar price.

That’s in the consumer space – where it gets interesting from the Enterprise Storage perspective.  When SANs were gaining popularity for their speed and capacities back in the early 2000’s, 10 TB SANs were enormous and only reserved for the largest and most IT aggressive customers.  In 2001, EMC Symmetrix 8000’s could be configured from 72 GB to “nearly 70...

Mike Reynolds PMM | 13 Jun 2011 | 2 comments

Hello all,

Interested in comparing performance characteristics of Storage Foundation to Solaris ZFS from the comfort of your own desk?  Check out this updated white paper at http://www.symantec.com/connect/articles/veritas-storage-foundation-and-sun-solaris-zfs-2011

Here is a brief introduction:

This whitepaper compares how Veritas Storage Foundation and Solaris ZFS perform for commercial workloads. The paper describes not only the raw performance numbers, but also the system and storage resources required to achieve those numbers.

Pure performance numbers are interesting, but what is often ignored is how much CPU time, memory, or I/O to disk was required to reach that number. In a performance benchmark situation, the resources used are less relevant, but for sizing server and storage infrastructure in...
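
That sizing argument is easy to demonstrate: report the resources consumed alongside the result. A minimal generic sketch in Python (the workload is a stand-in for the application under test; the resource module is Unix-only, and ru_maxrss is kilobytes on Linux):

import resource
import time

def workload():
    # Stand-in for the real application under test.
    return sum(i * i for i in range(5_000_000))

start_wall = time.perf_counter()
start_cpu = time.process_time()
workload()
wall = time.perf_counter() - start_wall
cpu = time.process_time() - start_cpu
peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

print(f"wall time : {wall:.2f} s")
print(f"CPU time  : {cpu:.2f} s ({cpu / wall:.0%} of one core)")
print(f"peak RSS  : {peak_kb} KB")

Two systems can post the same wall-clock number while one burns twice the CPU and memory to get there; for sizing servers and storage, the resource lines matter as much as the headline result.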