Netting Out NetBackup
cdavitian | 18 Feb 2014 | 0 comments

What was your reaction when you got your first phone call on a smartphone? If you’re like me, it was something like: 

“Oh yeah, this makes sense. The caller’s name appears on the screen. My music stops playing. With one click, I can respond with a text message. That’s how a phone should work.”

If you’re like my colleague Joel Martins, your reaction was:

“Can I manage a NetBackup restore like this?” No joke.

Joel runs the development group for NetBackup. He’s as "NetBackup" as it gets – he started as a green engineer in the NetBackup 3.2 days, and he even named his first child bprestore (joke…). More importantly, Joel was the first person to realize that we could apply the lessons of the iPhone age to data protection.

Backup is the backbone of any datacenter – it maintains data integrity so that the application infrastructure can do its job. We...

AbdulRasheed | 03 Jul 2012 | 90 comments

NetBackup Accelerator is an exciting feature introduced in NetBackup 7.5 and NetBackup Appliances software version 2.5. This blog is not a substitute for NetBackup documentation. Because NetBackup Accelerator transforms the way organizations do backups, I am compiling a list of frequently asked questions from the NetBackup Community forums and providing answers. If you have follow-up questions or feedback, post them as comments. I shall try to answer them, or someone else in the community can jump in. Fasten your seat belts! You are about to get accelerated! 

What is NetBackup Accelerator?

NetBackup Accelerator provides full backups for the cost of an incremental backup.

Cost reduction in full backups = reduction in backup window, backup storage, client CPU, client memory, client disk I/O, network bandwidth, etc.

NetBackup Accelerator makes this possible by making use of a platform and file system independent track log...

Mayur Dewaikar | 07 Dec 2011 | 0 comments

If you are evaluating dedupe solutions, the dedupe ratios claimed by dedupe vendors are bound to intrigue you.  I have seen claims of dedupe rates as high as 50:1, and I am sure there are claims even higher than that. Are such dedupe rates realistic? Truthfully, yes, but one must understand the assumptions and the math behind such high dedupe rates.  These dedupe rates generally rest on the following assumptions:

  1. Logical Capacity: Logical capacity is the amount of data one “would have” stored with no dedupe or compression. For example, if you are protecting 20 TB of data for 30 days and running daily backups, your total protected data (in theory) is 20 × 30 = 600 TB. In practice, for an environment with an average change rate, the back-end dedupe capacity roughly equals the front-end capacity over a 30-day retention period. So assuming 20 TB of dedupe storage is needed, your dedupe ratio is 600/20 = 30:1. While this makes...
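The logical-capacity arithmetic above can be checked in a few lines. The figures are the ones the post itself uses; nothing else is assumed.

```python
# Worked example of the "logical capacity" dedupe-ratio math:
# 20 TB of front-end data, retained as 30 daily full backups.

front_end_tb = 20          # data protected by each daily full backup
retention_days = 30        # number of daily backups retained

# Logical capacity: what would be stored with no dedupe or compression.
logical_tb = front_end_tb * retention_days   # 600 TB

# Assumption from the post: at an average change rate, the deduplicated
# back-end footprint over 30 days roughly equals the front-end size.
physical_tb = 20

ratio = logical_tb / physical_tb
print(f"{logical_tb} TB logical / {physical_tb} TB physical = {ratio:.0f}:1")
# → 600 TB logical / 20 TB physical = 30:1
```

Change the retention or the assumed back-end footprint and the headline ratio moves accordingly, which is exactly why the assumptions matter more than the ratio itself.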
AbdulRasheed | 30 Nov 2011 | 0 comments

When Lisa Graff, VP/GM of Intel’s Platform Engineering Group, took the stage at a special event during the SC ’11 Super Computing Conference in Seattle, I was not the only one wondering why the new launch was named EPSD 3.0 when there was no 2.0.  Within 10 minutes of her announcement speech, she articulated why it wasn’t just a 2.0!

Okay, what is EPSD? It stands for Intel Enterprise Platform and Services Division. This group designs and builds server boards and related products for channel partners and alliances. When Symantec, the world leader in security and storage solutions, sought a partner to help deliver its award-winning backup software in an appliance form factor, it selected Intel EPSD for an enterprise-class server board. The result can be found in the NetBackup 5220, a single-vendor, purpose-built, enterprise backup appliance from Symantec that...

Jed Gresham | 25 Aug 2011 | 1 comment

With the one-two punch of an earthquake in the Mid-Atlantic US followed closely by a hurricane potentially hitting the same region, Disaster Recovery is probably a popular discussion right now in the Washington, DC area.  Of course, Business Continuity professionals write contingency plans for all types of disasters, not just ones caused by nature.  Don’t you want to ask God, or Mother Nature, or the Flying Spaghetti Monster what else we should be planning for, and in what order or combination?  Do you think DC Business Continuity professionals have planned for things like enemy countries parachuting in armies of robotic killer crabs with pompadours and lasers?  What about zombies?  And is there a real difference between planning for zombies and planning for natural disasters?  Well, that’s what I’d like to explore today: the nuances of disaster planning for the zombie apocalypse as it pertains to data protection.

I’d like you to treat...

AbdulRasheed | 15 Aug 2011 | 10 comments

  IDC defines(1) a purpose-built backup appliance as a disk-based solution that utilizes software, disk arrays, server engine(s), or nodes used as a target for backup data, specifically data coming from a backup application (e.g., NetBackup, Backup Exec, NetWorker, etc.). These products are stand-alone disk systems purpose-built to serve as a target for backup.

  Symantec’s NetBackup 5200 series appliances, while purpose-built for backups, are much more than a traditional PBBA. For example, the all-new NetBackup 5220 appliance can be deployed as a NetBackup master server, a media server, or both. This modular appliance can be used to implement a brand-new NetBackup domain for a remote office or an enterprise without needing an additional software-based backup application to manage data protection for clients. The NetBackup 5220 is not just a purpose-built backup target,...

Mayur Dewaikar | 10 Aug 2011

In my current role at Symantec, I spend a lot of time talking to customers about their data protection strategies. It is interesting to note how much misinformation some of our competitors continue to give customers about Symantec’s deduplication technology.  They continue to scour older product manuals for information that is inaccurate and use it against Symantec to create FUD in the minds of customers. It has gotten so bad that I am going to recommend that one of our competitors change their tag line from “Where information lives” to “Where MIS-information lives”. OK, jokes aside, I thought it would be worthwhile to blog about exactly how Symantec approaches deduplication so we can put an end to all this misinformation.

Deduplication has clearly come a long way in...

AbdulRasheed | 18 May 2011 | 0 comments

There are a number of deduplication solutions from various vendors available for you to choose from. There are plenty of discussions about various kinds of deduplication solutions and how they help reduce the amount of secondary storage required for backups. In a series of blogs titled The Power of NetBackup Deduplication, we looked at various aspects of NetBackup deduplication and elaborated on how NetBackup provides a powerful, global, flexible, and application-aware deduplication technology that stands out from the crowd. The availability of an appliance form factor for NetBackup deduplication made it a very popular turnkey solution for enterprises. However, we didn’t really say anything about one of the most important decision factors when you are shopping for dedupe: how much does it really cost to implement and maintain a...

AbdulRasheed | 18 Apr 2011 | 0 comments

In the previous blog in this series (see links below) on the Power of NetBackup Deduplication, we talked about two special powers of NetBackup deduplication: how dedupe processing can be distributed and how backups are securely streamed. Now let us talk about two more exciting differentiators. 

Application aware deduplication

The data-reduction technology in NetBackup Appliances is NetBackup Deduplication.  Unlike third-party vendor solutions, where all backup streams are treated the same way in an effort to identify duplicate data, at the cost of excessive processing overhead, NetBackup Deduplication understands the backup streams. It uses the normalized stream to identify the data type, detect file boundaries, and deduplicate with less resource overhead. For example, a backup stream from a NetApp filer arriving in ufsdump (NDMP backup) format is identified by a deduplication stream handler that can individually process the file objects...
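To make the file-boundary idea concrete, here is a toy sketch of why splitting a stream at object boundaries dedupes well: an unchanged file always produces the same fingerprint, regardless of where it sits in the stream. This is only an illustration of the concept; the function names and store layout are invented, and real stream handlers parse actual formats such as ufsdump.

```python
# Toy illustration of boundary-aware dedupe: treat each file object in the
# stream as a dedupe unit, so unchanged files hash identically across
# backups. Names and data layout here are hypothetical.

import hashlib

def dedupe_by_file(files, store):
    """files: mapping of path -> file bytes parsed from the backup stream.
    store: fingerprint -> bytes (the dedupe pool). Returns new bytes stored."""
    new_bytes = 0
    for path, data in files.items():
        fp = hashlib.sha256(data).hexdigest()
        if fp not in store:
            store[fp] = data        # unique content: stored exactly once
            new_bytes += len(data)
    return new_bytes

pool = {}
monday  = {"/etc/hosts": b"hosts-v1", "/var/log/a": b"log-a"}
tuesday = {"/etc/hosts": b"hosts-v1", "/var/log/a": b"log-a2"}

print(dedupe_by_file(monday, pool))   # both files are new to the pool
print(dedupe_by_file(tuesday, pool))  # only the changed log adds bytes
```

With fixed-size chunking, a one-byte shift early in the stream could change every subsequent chunk; cutting at object boundaries avoids that alignment problem for file-granular data.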

AbdulRasheed | 13 Apr 2011 | 2 comments

Data deduplication is the most popular form of storage capacity optimization.  Deduplication makes it possible to store more data on less back-end storage, making it a very promising way to eliminate or minimize tape as the backup medium.

Traditional deduplication appliances may reduce the storage required for backups, but they still do not address key issues in data protection for enterprise data centers.

  1. Shrinking backup windows: The data needs to be streamed to a backup server before it can be written to deduplication storage, so the backup servers still need resources at the same level as, or a higher level than, before the deduplication device was introduced.  As the production data size increases, the backup infrastructure would need to be upgraded or expanded to maintain the backup window.
  2. Flooded network infrastructure: The traditional deduplication appliances are typically end points in a...
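The backup-window point above comes down to simple arithmetic: with a target-side dedupe appliance, the full data set still crosses the network and the backup server on every run. A back-of-envelope sketch, with all throughput and change-rate figures purely illustrative assumptions:

```python
# Back-of-envelope backup-window math. With target-side dedupe, 20 TB
# still streams through the backup server each run; if dedupe happens
# before the network, only changed/unique data travels. The 1 GB/s
# aggregate throughput and 2% daily change rate are assumed figures.

def backup_window_hours(data_tb, throughput_mb_s):
    mb = data_tb * 1024 * 1024          # TB -> MB
    return mb / throughput_mb_s / 3600  # seconds -> hours

full_stream = backup_window_hours(20, 1024)          # whole data set moves
dedupe_aware = backup_window_hours(20 * 0.02, 1024)  # only ~2% moves

print(f"full stream: {full_stream:.1f} h, source-side dedupe: {dedupe_aware:.1f} h")
```

Under these assumptions the window shrinks by the change-rate factor, which is why where deduplication happens matters as much as whether it happens.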