Netting Out NetBackup
Kyle Drake | 12 Apr 2013 | 0 comments

Enterprises have been going through numerous transformations over the years to save on storage costs, energy costs, administrative overhead, mistakes…anything that lowers their risk and responsibility as well as overall CapEx and OpEx.  Many are now looking to offload some of their data center responsibilities to service organizations, while others are moving not just some, but ALL of it to the cloud. 

If you’ve thought about moving your data to the cloud, you’ve probably asked yourself questions like “How will I get my data back?” and “Will I be able to protect my applications in the cloud?” NetBackup can help. Why should you think about NetBackup and the cloud?

  • Industry trends show cloud as a popular emerging technology that many are adopting for reasons mentioned above
  • NetBackup’s platform approach for cloud can help solve your fundamental data protection challenges
  • NetBackup...
Kyle Drake | 12 Apr 2013 | 0 comments

Big Data is an emerging, evolving technology. It’s the new thing that holds the promise of making sense of the terabytes, petabytes, and exabytes of data being generated. In today’s 24/7, information-driven economy, how can organizations quickly find and extract those key strategic nuggets of data to make their business more agile, make better business decisions, and gain that next competitive advantage?

Big Data today, as expressed by many, is simply the daily challenge virtually every enterprise IT organization faces when managing the protection of exploding data growth within shrinking backup windows, meeting growing compliance requirements, and working hard to transform their data centers. Whether it’s growing multi-terabyte databases, data warehouse appliances bursting at the seams, or simply hundreds of millions, or billions, of files that need both fast protection and recovery, big data is Big Data, and in its many forms and...

Mayur Dewaikar | 27 Jun 2012 | 0 comments

Data growth remains one of the top pain points for backup admins. Legacy data protection technologies are not keeping up with data growth. To address this problem, many organizations are re-architecting their backups. One of the key elements of backup redesign continues to be data deduplication.

Target-based dedupe solutions were the early entrants in the dedupe market, and a large number of users adopted these technologies over the years. While target-based dedupe solutions do a great job of reducing the data, they do so at the end of the backup cycle. The data essentially travels in a fully hydrated (non-deduplicated) format from the clients, through the media server, to the target device, where it finally gets reduced. You get great dedupe rates, which solves one problem—storage costs. But wait, how is this solving the backup window problem? Data will continue to grow, and if you don’t address the backup volume problem (the sheer volume...
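To make the network-traffic difference concrete, here is an illustrative sketch (hypothetical sizes and ratios, not measurements from any product) comparing how much data must cross the wire when deduplication happens at the target versus at the source:

```python
# Illustrative only: compares bytes sent over the network when deduplication
# happens at the target versus at the client. All numbers are hypothetical.

def bytes_transferred(total_backup_bytes: int, dedupe_ratio: float,
                      dedupe_at_source: bool) -> int:
    """Return the bytes that must cross the network for one backup job."""
    if dedupe_at_source:
        # The client deduplicates first, so only unique data travels.
        return int(total_backup_bytes / dedupe_ratio)
    # Target dedupe: the full ("hydrated") stream travels, then is reduced.
    return total_backup_bytes

TB = 1024 ** 4
full_backup = 10 * TB      # hypothetical 10 TB full backup
ratio = 10.0               # hypothetical 10:1 dedupe ratio

target_side = bytes_transferred(full_backup, ratio, dedupe_at_source=False)
source_side = bytes_transferred(full_backup, ratio, dedupe_at_source=True)
print(f"Target-side dedupe moves {target_side / TB:.0f} TB over the wire")
print(f"Source-side dedupe moves {source_side / TB:.0f} TB over the wire")
```

Under these assumed numbers, target-side dedupe still ships the full 10 TB across the network every time, which is exactly why storage savings alone do not shrink the backup window.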

Peter_E | 04 Jan 2012 | 5 comments

Could you obliterate your backup window problems with 100x faster backups? What if your car company called you up and told you that, with a software upgrade, your car could accelerate 100x faster? What if the county or province where you live told you that your daily trip to work or the grocery store would be 100x faster in the coming months? A new feature in the next release of NetBackup is expected to deliver just this type of massive leap in performance.

Symantec first gave a hint about this feature, which will be called NetBackup Accelerator, back at our US Vision conference in 2011 (read the press release here), where we announced our intention to break the backup window and provide customers with a plan to modernize data protection....

Phil Wandrei | 18 Nov 2011 | 2 comments

In the data protection world, one number we frequently see and hear is the deduplication rate. We hear of dedupe rates ranging from 50:1 and 20:1 down to 10:1. Recently, I heard someone say that 50:1 is 5 times better than 10:1. Their fuzzy math made me cringe, and I knew it was time to address this.

To clarify deduplication rates, we need to examine: 1) the factors that influence deduplication rates and 2) the math. 

Deduplication Factors

Deduplication rates are like automobile miles per gallon (mpg):  Your Results Will Vary. The factors that affect deduplication results are:

  • Type of data (unstructured versus structured data)
  • Change rate of data (what percent of the data changes)
  • Frequency and type of backup (how often do you back up the data: daily or weekly, full or incremental?)
  • Retention (how long do you keep the deduplicated data?)
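The math behind the fuzzy claim is easy to check. A dedupe ratio of N:1 means you store 1/N of the original data, so the interesting number is the percentage of storage saved. A quick sketch:

```python
# Converts a deduplication ratio (e.g. 50:1) into the percentage of storage
# saved, to show why 50:1 is not "5 times better" than 10:1.

def space_saved_pct(ratio: float) -> float:
    """Percent of storage eliminated by an N:1 dedupe ratio."""
    return (1 - 1 / ratio) * 100

for r in (10, 20, 50):
    print(f"{r}:1 dedupe -> {space_saved_pct(r):.0f}% less storage")
# 10:1 already eliminates 90% of the data; 50:1 eliminates 98%.
# Going from 10:1 to 50:1 gains only 8 more percentage points of savings:
# it shrinks the data you still store from 10% to 2% of the original.
```

So while 50 is five times 10, the practical difference is 98% savings versus 90% savings, which is why comparing raw ratios as multiples is misleading.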
Jed Gresham | 25 Aug 2011 | 1 comment

With the one-two punch of an earthquake in the Mid-Atlantic US followed closely by a hurricane potentially hitting the same region, Disaster Recovery is probably a popular discussion right now in the Washington DC area.  Of course Business Continuity professionals write contingency plans for all types of disasters, not just ones caused by nature.  Don’t you want to ask God, or Mother Nature, or the Flying Spaghetti Monster what else we should be planning for, and in what order or combination?  Do you think DC Business Continuity professionals have planned for things like enemy countries parachuting in armies of robotic killer crabs with pompadours and lasers?  What about zombies?  And is there a real difference between planning for zombies or natural disasters?  Well, that’s what I’d like to explore today; the nuances of disaster planning for the zombie apocalypse as it pertains to data protection.

I’d like you to treat...

Mayur Dewaikar | 10 Aug 2011

In my current role at Symantec, I spend a lot of time talking to customers about their data protection strategies. It is interesting to note how much misinformation some of our competitors continue to give customers about Symantec’s deduplication technology. They continue to scour older product manuals to find information that is inaccurate and use it against Symantec to create FUD in the minds of customers. It has gotten so bad that I am going to recommend that one of our competitors change their tagline from “Where information lives” to “Where MIS-information lives”. OK, jokes aside, I thought it would be worthwhile to blog about exactly how Symantec approaches deduplication so we can put an end to all this misinformation.

Deduplication has clearly come a long way in...

Mayur Dewaikar | 19 May 2011 | 0 comments

Having personally witnessed the evolution of data deduplication technologies over the years, I can say that deduplication has come a long way, both in terms of the maturity of vendor offerings and of end-user sophistication. Until about two years ago, I used to start my discussions with customers by explaining what deduplication is, what it can do for them, how the deduplication algorithm works, etc. Fast forward to 2011, and I rarely have to touch on these topics. Most users now have a fairly good understanding of the basics of deduplication. What I still find missing, however, is a deeper understanding of how the power of deduplication can be leveraged in multiple places to solve data protection problems. As an example, many users still swear by virtual tape libraries or target-based deduplication approaches for implementing deduplication. Some users have been sold on the concept of deduplicating data at the source using backup clients. Both of these methods work well in...

AbdulRasheed | 26 Apr 2011 | 0 comments

In the last two blogs (see the links below) we covered the cornerstone pillars that make NetBackup Deduplication stand out in the crowded deduplication solutions market. Let us conclude this series with a few implementation power points.

Flexibility to choose software or appliance solution

NetBackup Deduplication can be deployed as a software solution on commodity hardware and storage, or it can be deployed quickly using NetBackup appliances. The software solution provides the capability to design your own deduplication solution from scratch. The appliances provide turnkey solutions for both enterprise data centers and remote offices. Deduplication and other NetBackup features can be deployed in a matter of minutes using appliances.

Flexibility to choose inline or post-process deduplication

There are many vendors who can provide an appliance solution for deduplication. The backup server sends the data to the appliance; the data may be...

AbdulRasheed | 18 Apr 2011 | 0 comments

In the previous blog in this series (see links below) on the Power of NetBackup Deduplication, we talked about two special powers of NetBackup deduplication, viz. how dedupe processing can be distributed and how backups are securely streamed. Now let us talk about two more exciting differentiators.

Application aware deduplication

NetBackup Deduplication is the data reduction technology in NetBackup Appliances. Unlike third-party vendor solutions, where all backup streams are treated the same way in an effort to identify duplicate data, with excessive processing overhead, NetBackup Deduplication understands the backup streams. It uses the normalized stream to identify the data type, detect file boundaries, and perform deduplication with less resource overhead. For example, a backup stream from a NetApp filer arriving in ufsdump (NDMP backup) format is identified using a deduplication stream handler that can individually process the file objects...
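The fingerprint-and-store idea underneath any dedupe engine can be sketched in a few lines. This is a generic, toy illustration using fixed-size chunks and SHA-256 fingerprints; it is not NetBackup’s actual implementation, which (as described above) uses stream handlers to align processing on the file boundaries it parses out of formats such as NDMP ufsdump:

```python
# Toy fixed-size-chunk deduplicator with SHA-256 fingerprints.
# Generic illustration only -- NOT NetBackup's stream-handler design.
import hashlib

CHUNK_SIZE = 4096  # hypothetical chunk size

def dedupe(stream: bytes, store: dict) -> list:
    """Split a stream into chunks, storing each unique chunk only once.
    Returns the list of fingerprints (the 'recipe' to rebuild the stream)."""
    recipe = []
    for i in range(0, len(stream), CHUNK_SIZE):
        chunk = stream[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)   # write the chunk only if its fingerprint is new
        recipe.append(fp)
    return recipe

store = {}
backup1 = b"A" * 8192 + b"B" * 4096
backup2 = b"A" * 8192 + b"C" * 4096   # mostly the same data as backup1
dedupe(backup1, store)
dedupe(backup2, store)
print(f"Chunks processed: 6, unique chunks stored: {len(store)}")  # 3 stored
```

A format-aware stream handler improves on this naive scheme: by chunking on real file boundaries instead of arbitrary fixed offsets, identical files are recognized as duplicates even when they shift position within the backup stream.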