Storage & Clustering Community Blog
Doreet | 23 Oct 2012 | 0 comments

Taneja says "Symantec Veritas Cluster Server Takes Virtual Server HA to the Next Level"

This is an awesome testimonial from Jeff Byrnes, an analyst at Taneja Group, a firm that provides expert analysis and consulting services for the computer storage and server industries.

“All in all, we think the latest Veritas Cluster Server for VMware release adds significant value to customers running business-critical applications in virtualized environments.”

Doreet | 22 Oct 2012 | 1 comment

ESG Labs just published a blog on the ESG website discussing Storage Foundation performance and capabilities on Linux compared with the native Linux tools.

This is AWESOME!!! Check it out!

http://www.esg-global.com/blogs/come-one-come-all-to-the-greatest-show-on-earth/

S_D | 22 Oct 2012 | 0 comments

The Enterprise Strategy Group (ESG) calls Veritas Storage Foundation the tool that makes your IT data centers the greatest show on earth. And rightly so - there's not much else out there that makes performance problems disappear, eases your storage woes, and gives you the most bang for your current buck. No more late nights and hours spent agonizing over how to explain to management the downtime issues your customers face and the uncontrolled downward spiral of the stock price.

Veritas Storage Foundation from Symantec is the behind-the-scenes tool that makes all the magic happen. It is the foundation that keeps your data center running smoothly, and ESG is calling one and all to take front-row seats and watch the show: Veritas Storage Foundation from Symantec Performance and Efficiency advantages for Linux.

Performance, Ease of Management and...

Bala Kumaresan | 21 Oct 2012 | 0 comments

We are working on a "SmartIO" feature for the next release of Storage Foundation that can save customers a ton of money.

How do customers save a ton of money with the SF "SmartIO" feature?

The SF “SmartIO” feature delivers the following:

  1. Increased server utilization, by eliminating IO bottlenecks
  2. Support for low-latency, high-throughput IO, by using flash storage present in the server
  3. Reduced IO load on storage controllers, as most application IO is served by the “SmartIO” layer

As a result, fewer servers are needed for the same application throughput; storage is commoditized because high-end storage controllers are no longer required; and fewer storage controllers are needed because the IO demand on the storage array is much lower. All of this saves SF customers money - on both capital expenses and operational expenses....
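To make the caching idea concrete, here is a minimal sketch in Python (emphatically not Symantec's implementation - the class name, paths, and block size are all hypothetical) of a read-through cache that serves hot blocks from server-side flash and only goes to the array on a miss:

    # Minimal sketch of a server-side flash read cache (illustrative only).
    # Hot blocks are kept on local flash; only cache misses generate IO
    # against the back-end storage array.
    import os

    BLOCK_SIZE = 4096  # hypothetical block size

    class FlashReadCache:
        def __init__(self, flash_dir, array_device):
            self.flash_dir = flash_dir        # e.g. a filesystem on a local SSD/PCIe flash card
            self.array_device = array_device  # e.g. a LUN presented by the storage array
            os.makedirs(flash_dir, exist_ok=True)

        def _cached_path(self, block_no):
            return os.path.join(self.flash_dir, "block_%d" % block_no)

        def read_block(self, block_no):
            cached = self._cached_path(block_no)
            if os.path.exists(cached):
                # Cache hit: served from local flash; no IO reaches the array controller.
                with open(cached, "rb") as f:
                    return f.read()
            # Cache miss: read once from the array, then populate the flash cache.
            with open(self.array_device, "rb") as f:
                f.seek(block_no * BLOCK_SIZE)
                data = f.read(BLOCK_SIZE)
            with open(cached, "wb") as f:
                f.write(data)
            return data

In this toy model every repeat read of a hot block is absorbed by the flash tier, which is the mechanism by which the IO load on the array controllers - and with it the number of controllers and servers needed - comes down.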

bpascua | 14 Oct 2012 | 0 comments

This week Telefonica, the owner of O2, controversially announced that it will start selling its O2 mobile customer data to third parties. Through Telefonica's Dynamic Insights business unit, it is looking to drive revenue by mining data to sell to third parties. It will take its phone network data and anonymise and aggregate it. The product will be called Smart Steps and will be sold to public sector companies to enable them to measure, compare and understand what factors influence the number of people visiting a location. The aim is that retailers can then make informed decisions on things like locations and opening times for shops. The big question is around privacy in the use of customer data. People reacted very strongly when Google started using customer data for targeted advertising, and this seems to be in the same vein. As a mobile phone user I am not sure how happy I am effectively having my movements traced, even though I understand this...

dennis_wenk | 05 Oct 2012 | 0 comments

The Basel II Accord for International Banking defines Operational Risk as the “risk of loss from inadequate or failed internal processes, people, and systems or external events.”  When processes, people or systems fail, whether from internal or external events, the losses can be substantial.  As an example, the Ponemon Institute estimates that organizations worldwide are losing over $35 billion monthly from data center downtime.  Nicholas G. Carr points out in his seminal Harvard Business Review article IT Doesn’t Matter, “today, an IT disruption can paralyze a company’s ability to make products, deliver its services, and connect with its customers, not to mention foul its reputation … even a brief disruption in availability of technology can be devastating.”

There are two primary ways for an organization to increase value.  The first way is to...

dennis_wenk | 04 Oct 2012 | 1 comment

“Best Practices” is a popular expression of the intent to manage business continuity prudently.  Best Practices are seen as a way to sidestep both the quantification of operational risks and the objective evaluation of the cost-benefit of any proposed mitigation actions.  There are several reasons why Best Practices are not, in fact, best for Business Continuity purposes.

  • It is unreasonable to assume that a best practice could optimally answer the business continuity questions for multiple organizations.  Organizations differ widely in terms of their maturity level, the technologies they have deployed, and their vulnerabilities.
  • Given the wide assortment of published ‘best practices’, which of them really are the ‘best’ for any particular circumstance?
  • No organization could hope to implement all of the thousands of best practices to get it perfectly right, and there is...

dennis_wenk | 04 Oct 2012 | 0 comments

Virtualization is the creation of a virtual (rather than actual) version of something, such as an operating system, a server, a storage device or network resources. Virtualization is a computing technology that enables a single user to access multiple physical devices. This paradigm manifests itself as a single computer controlling multiple machines, or one operating system utilizing multiple computers to analyze a database. Virtualization is about creating an information technology infrastructure that leverages networking and shared physical IT assets to reduce or eliminate the need for physical computing devices dedicated to specialized tasks or systems.

Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Through cloud computing, a world-class data center service and collocation provider offers managed IT services through a hosted or "Software...

Andy Spry | 04 Oct 2012 | 1 comment

"Is Oracle putting Oracle Solaris Cluster on an EOL trajectory?"

What a question to begin a blog! So what’s happening? What has Oracle done? Oracle has made significant changes to the purchase and support of Oracle Solaris Cluster:

First, Oracle is now charging for Sun Cluster (with no maintenance discounting) at $3,000 per CPU core! (Source: https://shop.oracle.com/pls/ostore/f?p=700:6:0::::P6_LPI:114206479116231741572457). For example, the perpetual licence cost for Oracle Solaris Cluster on a two-CPU Solaris T4 system with 32 cores per CPU (that’s 2 × 32 × $3,000) is $192,000 at list price! A very significant cost, even after discount, for a product that used to be free!
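As a quick sanity check on that figure (a trivial sketch; the only input taken from Oracle is the $3,000-per-core rate quoted above), the list price is simply CPUs × cores per CPU × price per core:

    # List-price arithmetic for the example above (numbers as quoted in the post).
    cpus = 2
    cores_per_cpu = 32
    price_per_core = 3000  # USD per core, per the Oracle store link above
    list_price = cpus * cores_per_cpu * price_per_core
    print("List price: $%s" % format(list_price, ","))  # List price: $192,000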

Second, Oracle has dropped support for Symantec's Storage Foundation with Oracle Solaris...

TonyGriffiths | 03 Oct 2012 | 0 comments

SFHA 5.1 Service Pack 1 Rolling Patch 3 (UNIX/Linux) is now available.

The patch can be downloaded from the Symantec Operations Readiness Tools (SORT) portal.

Sign up to SORT to receive notifications on new patches, documentation, agents and ASLs.

 

Rank | Product | Release type | Patch name | Release date
1 | Veritas Storage Foundation HA 5.1SP1PR3 | Rolling Patch | sfha-sol_x64-5.1SP1PR3RP3 | 2012-10-02
2 | Veritas Storage Foundation HA 5.1SP1 | Rolling...