Storage & Clustering Community Blog
Showing posts tagged with Storage Foundation
Showing posts in English
Steve_Grimwood | 25 Mar 2013 | 0 comments

As in my first blog, 'Thin Provisioning - How Symantec Can Help', we recognise the "data explosion" as fact rather
than fable, and, as before, it is set to continue for the foreseeable future.  Projected growth rates are inexact,
but all agree that data creation and retention are on the rise, placing further demand on an already stretched
IT storage team.

Again, most customers I speak with are investigating some form of data/storage optimization.
Storage is often the largest item of spend in the IT budget, and of course CSOs and CTOs are being tasked with reducing it.
The available solutions range from something as simple as migrating data off tier 1 storage, to adopting newer
technologies such as SSDs and Thin Provisioning, to established technologies such as compression and deduplication;
the list goes on…

Today’s blog will be targeting space utilization and more...

Steve_Grimwood | 25 Mar 2013 | 0 comments

Today we recognise the "data explosion" (the continued increase in the amount of digital data created and stored)
as fact rather than fable, and it is set to continue for the foreseeable future.
Projected growth rates are inexact and vary between analysts, but all agree that data creation and retention are
on the rise, placing further demand on an already stretched IT storage team.
 
With this in mind, most customers I speak with are looking at some form of data/storage optimization.
Storage is typically the largest part of the IT budget, with CSOs and CTOs being tasked with reducing it.
The available solutions range from something as simple as migrating data off tier 1 storage, to adopting newer
technologies such as SSDs and Thin Provisioning, to established technologies such as compression and deduplication;
the list goes on…

Kimberley | 19 Mar 2013 | 0 comments

Although solid-state continues to be touted as a way to address storage bottlenecks for performance-intensive applications, enterprises are struggling with how to implement the technology in an optimized manner. The benefits of solid-state are great, but they cannot be fully realized unless it is paired with enterprise-grade data management software that optimizes the available capacity, provides high levels of data protection and continuous application availability, and makes cost-effective use of the available storage tiers. Symantec's Veritas Storage Foundation can help alleviate these issues: it increases visibility, automates administration, stores data efficiently, and provides dynamic storage tiering. To learn more about how Symantec storage solutions can help your organization reap the benefits of solid-state technology, check out this article: http://bit.ly/Y4jiAR...

Corradino Milone | 18 Mar 2013 | 0 comments

 

Point-in-time copy is a technology that makes copying large sets of data a common, routine activity.

Just as photographic snapshots capture images of physical action, point-in-time copies, or snapshots, are virtual or physical copies of data that capture the state of a data set's contents at a single instant. Both virtual (copy-on-write) and physical (full-copy) snapshots protect against corruption of the data's contents; full-copy snapshots additionally protect against physical destruction. These copies can be used for backup, as a checkpoint from which to restore the state of an application, for data mining, as test data, and for other kinds of off-host processing. An important property of these copies is that, from the application's point of view, the copy appears to occur atomically: every update to the original data is applied either entirely before or entirely after the point-in-time copy.
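
To make the copy-on-write idea concrete, here is a rough sketch in Python (my own illustration of the general technique, not the Veritas implementation): only blocks of the original that are overwritten after the snapshot have their old contents preserved, so a read of the snapshot always returns the data as it was at that instant.

    class Volume:
        def __init__(self, blocks):
            self.blocks = list(blocks)          # "physical" block contents

    class CowSnapshot:
        def __init__(self, origin):
            self.origin = origin
            self.saved = {}                     # block index -> pre-write contents

        def write_origin(self, index, data):
            # Before the origin block is overwritten, preserve its old
            # contents once -- this is the "copy on (first) write".
            if index not in self.saved:
                self.saved[index] = self.origin.blocks[index]
            self.origin.blocks[index] = data

        def read(self, index):
            # Snapshot reads come from the saved copy if the block has changed,
            # otherwise straight from the unchanged origin block.
            return self.saved.get(index, self.origin.blocks[index])

    vol = Volume(["a", "b", "c"])
    snap = CowSnapshot(vol)                     # the point-in-time copy
    snap.write_origin(1, "B")                   # an update arriving after the snapshot
    print(vol.blocks)                           # ['a', 'B', 'c'] - current data
    print([snap.read(i) for i in range(3)])     # ['a', 'b', 'c'] - data as of the snapshot

A full-copy snapshot, by contrast, duplicates every block up front, which is why it can also survive physical destruction of the original.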

The definition that the Storage Networking Industry Association (SNIA) gives...

bpascua | 14 Mar 2013 | 0 comments

Cloud computing has long been a buzzword in the IT sector. A fitting definition of cloud is "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort". Many organisations have been cautiously dipping their toes into the public cloud market. Many applications can be delivered seamlessly through public clouds, which removes the need for on-premise servers and reduces the capex costs involved in hosting these services on premise. Public clouds offer the benefit of using someone else's infrastructure to run IT workloads on a pay-as-you-go basis, thus reducing your capex costs. Private clouds are different: they leverage cloud technology to reduce overall opex costs and improve business agility. Many organisations, particularly in the financial sector, simply won't use public clouds for their mission critical...

Michael Black | 07 Mar 2013 | 0 comments

 

Hi, all.

We have released a set of articles that contain information about troubleshooting the "failed" disk status, as reported by vxdisk.

Here is the link:

"Failed" or "failed was" is reported by vxdisk
http://www.symantec.com/docs/TECH200618

Since this is a broad topic, the "technote" is actually a set of about a dozen articles that have been organized into a logical tree structure, with TECH200618 at its "root".

Let us know what you think!
 

Regards,

Mike

dennis_wenk | 27 Feb 2013 | 0 comments

The modern organization is highly dependent on information technology; at the same time, and quite unintentionally, information technology has introduced new exposures that have deceptively seeped into every layer of the financial organization.  The likelihood that an organization will experience a catastrophic loss from an IT-service interruption caused by an IT issue is far greater than the likelihood of an interruption caused by some disaster or 'black swan' event.  Still, the key to survival is allocating the appropriate amount of resources to the "right" risks; while that may include planning contingencies for a worst-case scenario, being rational about risk requires more guidance on the investment tradeoffs that mitigate it.

The “Big Question” is how to optimize scarce resources today to achieve the greatest reduction in future losses.  The Big Question has two components: (1) which risks are the serious ones and...
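
The standard way to put numbers behind that question is expected loss: annualized loss expectancy (ALE) is simply event frequency multiplied by loss per event, and a mitigation is judged by how much expected loss it removes per dollar spent. Below is a minimal Python sketch of that ranking; every name and figure in it is a made-up illustration, not data from this post.

    # Rank exposures by expected yearly loss (ALE), then show the loss
    # avoided per dollar of mitigation spend for each one.
    risks = [
        # name,                   events/year, loss per event, mitigation cost, loss avoided/yr
        ("SAN firmware bug",           0.50,        400_000,        50_000,        150_000),
        ("Regional flood",             0.02,      5_000_000,       250_000,         80_000),
        ("Operator config error",      2.00,         60_000,        20_000,         90_000),
    ]

    for name, rate, impact, cost, avoided in sorted(
            risks, key=lambda r: r[1] * r[2], reverse=True):
        ale = rate * impact                      # expected loss per year
        roi = avoided / cost                     # expected loss avoided per dollar spent
        print(f"{name:24s}  ALE=${ale:>10,.0f}  mitigation ROI={roi:.2f}")

On made-up figures like these, the everyday operational IT risks dominate the 'black swan', which is exactly the point being made above.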

c3lsius | 19 Feb 2013 | 0 comments

The Veritas Operations Manager team is launching a new version in the middle of this year, and we want you (current or new customers) to try out the beta and give us feedback. Check out the VOM and VOM Advanced 6.0 Technology Preview post for the new features you can expect and how to participate in the beta program.

Also, the VOM Tell Your Story Contest for current VOM or VOM Advanced customers has been extended to April 30, 2013. Share your experience with VOM or VOM Advanced to earn extra Symantec Connect points. Here are the contest details.

TonyGriffiths | 01 Feb 2013 | 0 comments

Veritas Storage Foundation and High Availability Solutions (SFHA) 6.0.3 is now available

For AIX, Solaris, Red Hat Enterprise Linux, SUSE Linux and HP-UX

SORT links are below:

 

Use SORT notifications to receive updates on new patches and documentation.

 

Cheers

Tony

Rank  Product                              Release type         Patch name  Release date
1     Veritas Storage Foundation HA 6.0.1  Maintenance Release  ...

bpascua | 23 Jan 2013 | 0 comments

When we talk about TRIM we aren't referring to losing a few pounds after Christmas, although I could certainly use that. TRIM relates to Solid State Drive and Flash technologies, which are now becoming more prevalent in data centres as well as in the consumer world. If you look to buy a laptop these days there is a strong case for opting for a solid state drive to give you faster boot times and better performance. Similarly, there are many options in the enterprise market, from true flash arrays like Violin to PCIe accelerator cards like Fusion IO. To understand TRIM you need to have an idea of how flash storage works. SSDs use NAND memory to store and transfer information in pages.  A collection of pages makes up a block. You cannot erase an individual page; you can only erase a whole block of pages. So when you delete a file it actually just gets marked for deletion, and later, when enough pages are available, they are erased. This practice slows...
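
As a rough illustration of that page/block behaviour (a simplified Python sketch of my own, not any vendor's firmware), the model below writes pages individually, lets the OS mark pages as trimmed, and only reclaims space a whole block at a time; trimmed pages can simply be dropped at erase time instead of being copied elsewhere.

    BLOCK_SIZE = 4  # pages per block, deliberately tiny for illustration

    class Block:
        def __init__(self):
            self.pages = [None] * BLOCK_SIZE     # None = empty page
            self.stale = [False] * BLOCK_SIZE    # pages marked for deletion

        def write(self, i, data):
            if self.pages[i] is not None:
                raise ValueError("NAND pages cannot be overwritten in place")
            self.pages[i] = data

        def trim(self, i):
            # The filesystem tells the drive this page's data is no longer needed.
            self.stale[i] = True

        def erase(self):
            # Erasure works only on the whole block: live pages must be copied
            # elsewhere first, while trimmed (stale) pages are simply dropped.
            live = [p for p, s in zip(self.pages, self.stale) if p is not None and not s]
            self.pages = [None] * BLOCK_SIZE
            self.stale = [False] * BLOCK_SIZE
            return live                           # data the drive had to relocate

    blk = Block()
    for i, d in enumerate(["a", "b", "c", "d"]):
        blk.write(i, d)
    blk.trim(1)                                   # file deleted; the OS issues TRIM
    blk.trim(3)
    print(blk.erase())                            # ['a', 'c'] - only live pages need copying

Without the TRIM hints, the drive would have had to copy all four pages before erasing the block, which is precisely the slowdown described above.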