Netting Out NetBackup
SeanRegan | 26 Jan 2012 | 0 comments

Backups have become a big and burdensome operation for many backup admins. SLAs are getting tighter while information grows and new platforms like virtualization create higher density environments. With these forces in play, the current approach to backup modernization is not effective. Today, “backup modernization” is championed by vendors offering solutions that address only one or two aspects of backup – such as deduplication, snapshots, or tools for backing up just VMware and Hyper-V environments. These are quick fixes, not modern data protection. Throwing more point solutions at the problem is itself the cause of backup complexity and cost.

Backup and recovery is a crucial step in protecting an organization’s information and its ability to stay in business if something goes wrong.   A new approach is needed. To determine current trends, Symantec commissioned a global survey of enterprises...

Kristine Mitchell | 20 Jan 2012 | 1 comment

Forget the itchy socks, the sweater I will never wear, and the crock pot (nice try…but still not getting me to cook). You can imagine my excitement when I opened this Christmas gift – a vintage typewriter keyboard for my iPad. If you are like me, you love your iPad but still struggle with the keyboard. You also love cool and unusual gifts. But this got me thinking: what is it about human nature that regresses to old practices? You know what I’m talking about – two steps forward, one step back. This was so obvious to me when I spoke to a customer the other day. They have a manual, “old school” process in place to protect their virtual machines (VMs). In fact, they have one full-time person who does nothing but map VM datastores to backup policies. Now here’s a cool technology like virtualization and...

Danny Milrad | 16 Jan 2012 | 0 comments

Never underestimate the bandwidth of a station wagon loaded with backup tapes. This was thrown on the table during a recent customer meeting in the context of getting data offsite and onto disaster recovery sites. What a great visual, I thought to myself. The customer continued: FedEx is an amazing network; it has high bandwidth but also high latency. They move millions of packages every day…phenomenal bandwidth. But in the always-on economy, 24 hours to ship a backup tape is the epitome of high latency.
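The customer’s point is easy to check with back-of-the-envelope math. Here is a quick sketch comparing the effective throughput of a courier shipment against a WAN link; the tape count, capacity, and transit time are illustrative assumptions, not figures from the conversation:

```python
# Back-of-the-envelope comparison: shipping tapes vs. a WAN link.
# All figures below are illustrative assumptions, not vendor specs.

def shipping_throughput_gbps(tape_count, tb_per_tape, transit_hours):
    """Effective throughput of a courier shipment, in gigabits per second."""
    total_bits = tape_count * tb_per_tape * 1e12 * 8   # TB -> bits
    return total_bits / (transit_hours * 3600) / 1e9   # bits/s -> Gb/s

# A box of 100 LTO-5 tapes (1.5 TB native each) on a 24-hour overnight run:
courier_gbps = shipping_throughput_gbps(100, 1.5, 24)
wan_gbps = 1.0  # a dedicated 1 Gb/s link, for comparison

print(f"Courier: {courier_gbps:.1f} Gb/s effective, latency ~24 hours")
print(f"WAN:     {wan_gbps:.1f} Gb/s, latency in milliseconds")
```

The station wagon wins on raw throughput by an order of magnitude, but every byte arrives a day late – which is exactly the high-bandwidth, high-latency trade-off the customer described.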

I talk to customers regularly about their backup rituals and disaster recovery plans. Like clockwork, FedEx (or another overnight carrier) comes up as the preferred network transport to ship tapes to the salt mines. But I had to ask myself: is putting your company’s most valuable data on trucks the best option? If it’s getting backed up and sent offsite, it has to have at least some value, right? While I...

Alex Sakaguchi | 16 Jan 2012 | 0 comments

All bark and no bite. Heard the saying? It essentially describes someone who talks tough but shies away when asked to step up.

In 2010, Symantec conducted a survey of more than 1,600 senior IT and legal executives in 26 countries to determine the best – and worst – practices in the area of information management and published the results in the 2010 Information Management Health Check Survey.

87% of these folks said that enterprises should have a proper information retention strategy that allows them to delete unnecessary information.

Why, then, do fewer than 46% actually have one?

What’s the deal?  Why is it so hard to delete information that isn’t needed? 

Well, that’s the key isn’t it? ...

Randy Serafini | 09 Jan 2012 | 3 comments

How many of you have made a New Year’s resolution to lose weight? Well, OK, maybe not weight literally, but you’ve been tasked to find ways in 2012 to reduce cost in your backup infrastructure and, with a bit of smart maneuvering, actually improve backup and recovery performance. In the upcoming release of NetBackup, Symantec can help reduce cost and accelerate recovery by integrating backup and snapshot replication management with the new NetBackup Replication Director.

In most enterprise backup environments, backup software and array-based snapshot technology co-exist to provide tiers of protection and recovery. The problem is that in many cases the backup infrastructure is managed by the backup team, while snapshots and replicas are managed by the storage team. While this may ‘appear’ efficient when things are good, when things turn bad and a fast recovery is required from either a major or minor disaster, it...

Peter_E | 04 Jan 2012 | 5 comments

Could you obliterate your backup window problems with 100x faster backups? What if your car company called to tell you that a software upgrade could make your car accelerate 100x faster? What if the county or province where you live told you that your daily trip to work or the grocery store would be 100x faster in the coming months? A new feature in the next release of NetBackup is expected to deliver just this type of massive leap in performance.

Symantec first gave a hint about this feature, which will be called NetBackup Accelerator, back at our US Vision conference in 2011 (read the press release here), where we announced our intention to break the backup window and provide customers with a plan to modernize data protection....

CRZ | 19 Dec 2011 | 5 comments

I'm very pleased to announce that a new Maintenance Release for NetBackup 7.1 is now available!

NetBackup 7.1.0.3 is the third maintenance release for NetBackup 7.1. This release adds the following new platform support:

  • Support for vSphere 5
  • Support for SharePoint 2010 SP1 and Exchange 2010 SP2
  • Client support for Mac OS X 10.7
  • Master and media support for AIX 7.1
  • NBSL changes to gather hardware information from appliance media servers attached to NBU7.1.x master servers

Along with the platform additions mentioned above, this release fixes several customer issues and internal engineering defects, covering:

  • Resolution of deduplication issues around data inconsistency, the stream handler, GRT, and high memory consumption during backups
  • Resolution of performance issues experienced by customers in BMR pre-restore environments since 7.0.x
  • Restore-related issues in BMR on Windows and HP ‘G...
Mayur Dewaikar | 07 Dec 2011 | 0 comments

If you are evaluating dedupe solutions, the dedupe ratios claimed by vendors are bound to intrigue you. I have seen claims of dedupe rates as high as 50:1, and I am sure there are claims even higher than that. Are such dedupe rates realistic? Truthfully, yes, but one must understand the assumptions and the math behind them. These dedupe rates generally rest on the following assumptions:

  1. Logical Capacity: Logical capacity is the amount of data one “would have” stored with no dedupe or compression. For example, if you are protecting 20 TB of data for 30 days with daily backups, your total protected data (in theory) is 20 x 30 = 600 TB. In practice, for an environment with an average change rate, the backend dedupe capacity is roughly equal to the front-end capacity for a 30-day retention period. So assuming 20 TB of dedupe storage is needed, your dedupe ratio is 600/20 = 30:1. While this makes...
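The logical-capacity arithmetic above is simple enough to sketch in a few lines. This is a minimal illustration of the math as described in the post, with the same 20 TB / 30 day example; the function name and parameters are my own, not from any vendor tool:

```python
# Sketch of the "logical capacity" math behind headline dedupe ratios.
# Assumes one full backup per day for the whole retention period,
# which is exactly the assumption the marketing claims rely on.

def dedupe_ratio(front_end_tb, retention_days, backend_tb):
    """Ratio of logical (pre-dedupe) capacity to physical storage consumed."""
    logical_tb = front_end_tb * retention_days  # data one "would have" stored
    return logical_tb / backend_tb

# 20 TB protected daily for 30 days, landing on 20 TB of dedupe storage:
ratio = dedupe_ratio(front_end_tb=20, retention_days=30, backend_tb=20)
print(f"Logical capacity: {20 * 30} TB, dedupe ratio {ratio:.0f}:1")
```

Note how the headline number scales with retention: the same environment kept for 60 days would advertise 60:1, even though the physical storage consumed barely changes.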
AbdulRasheed | 30 Nov 2011 | 0 comments

When Lisa Graff, VP/GM of Intel’s Platform Engineering Group, took the stage at a special event during the SC ’11 Super Computing Conference in Seattle, I was not the only one wondering why the new launch was named EPSD 3.0 when there was no 2.0. Within 10 minutes of her announcement speech, she articulated why it wasn’t just a 2.0!

Okay, what is EPSD? It stands for Intel Enterprise Platform and Services Division. This group designs and builds server boards and related products for channel partners and alliances. When Symantec, the world leader in security and storage solutions, sought a partner to help deliver its award-winning backup software in an appliance form factor, it selected Intel EPSD for an enterprise-class server board. The result can be found in the NetBackup 5220, a single-vendor, purpose-built, enterprise backup appliance from Symantec that...

SeanRegan | 29 Nov 2011 | 1 comment

If there is one key challenge for the virtualization team, it is backup. All of that newfound agility that makes the virtual machine (VM) teams ninja-like in their ability to deliver IT as a service comes with a backend challenge. As more and more mission-critical applications and systems go virtual, how can these teams make sure they can deliver the same or better SLAs for backup? Virtualized systems and data are not second-class workloads anymore; they are prime time. And lest you think virtualization is only a big-company phenomenon – think again. Small and mid-sized companies are adopting server virtualization technology at a faster pace than their bigger counterparts. So the issue of protecting important data in virtualized environments is touching your neighborhood firms as much as big-name businesses.

Vendor Landscape – Proceed with Caution

It’s no secret that the virtualization...