Symantec Analyst Relations
Symantec Analyst Relations | 14 May 2012 | 0 comments

Cross-posted from Symantec Security Response blog

Join Symantec security experts on Twitter (using the #ISTR hashtag) on Tuesday, May 15, at 10 a.m. PT / 1 p.m. ET to chat about the key trends highlighted in Symantec’s recently released Internet Security Threat Report, Volume 17.

This year’s report, which covers the major threat trends observed by Symantec in 2011, highlights several troubling developments. For example:

  • Symantec blocked more than 5.5 billion malicious attacks in 2011, an increase of 81 percent over the previous year.
  • The number of unique malware variants increased to 403 million and the number of Web attacks blocked per day increased by 36 percent.
  • Targeted attacks are growing,...
Symantec Analyst Relations | 29 Apr 2012 | 0 comments

Symantec’s Internet Security Threat Report showcases the threat landscape in 2011, highlighting that the number of malware attacks increased by 81 percent. Read the report to get the latest on key trends including advanced targeted attacks that are expanding to focus on organizations of all sizes, increased data breaches, and a continued focus on mobile threats by attackers.

nicolas_popp | 24 Apr 2012 | 0 comments

We can blame networking. IT security was never simple, even when computers were largely stand-alone machines connected to banks of green-screened terminals. However, interfaces between systems were generally inaccessible to all but operations staff.

It was when we started to allow outside connections that things stepped up a level in terms of risks. First dial-in connections, then local area networking meant that access was available to anyone who had appropriate communications tools.

It was around this time that security professionals invented the term 'defence in depth'. The idea was that security existed as circles within circles, each layer protected at the boundary by appropriate technology.
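
To make the idea concrete, here's a minimal sketch in Python; the layer names and checks are hypothetical, purely for illustration, but the shape is the point: a request is admitted only if every concentric layer approves it.

    # A minimal sketch of defence in depth: a request must clear every
    # layer's boundary check before reaching the innermost systems.
    # Layer names and checks are invented for illustration only.

    def perimeter_firewall(request):
        return request.get("source_ip") != "203.0.113.99"  # hypothetical blocklist

    def network_acl(request):
        return request.get("port") in {443, 22}  # only expected services

    def host_authentication(request):
        return request.get("credentials") == "valid-token"

    # Circles within circles: outermost layer first.
    LAYERS = [perimeter_firewall, network_acl, host_authentication]

    def admit(request):
        """Admit a request only if every concentric layer approves it."""
        return all(check(request) for check in LAYERS)

    req = {"source_ip": "198.51.100.7", "port": 443, "credentials": "valid-token"}
    print(admit(req))  # True: passed all three layers

The value of the model is that a failure at any single boundary does not, by itself, expose the core.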

Then the Internet came along and changed everything again. Without going through the entire history of the past two decades, we're now at a point where data can be in any one of several places – on corporate systems, on computers in home offices, on...

GregDay-SecurityCTO | 24 Apr 2012 | 0 comments

A conversation I sometimes get involved in with customers is, "How should we secure vSphere?" The environment doesn't have to be VMware-based, of course; it could be Xen, Microsoft, Red Hat or any other, but the question remains the same.

From a technical perspective, the set of risks is reasonably well understood and, by and large, appropriate mitigations exist. For example, each virtual machine, and the network connections between VMs, need to be as secure as their physical equivalents. Meanwhile, security holes could exist in the hypervisor layer, as with any other software package. Protections such as defence in depth, intrusion detection and prevention, patch management and so on remain much the same as in the traditional, physical world.
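
As a rough sketch of the kind of check this implies (the VM inventory, field names and policy thresholds here are invented assumptions, not any particular vendor's API), a baseline policy over virtual machines might look like this in Python:

    # A toy compliance check over a VM inventory: each VM should meet
    # the same baseline controls as a physical host. All field names
    # and thresholds are illustrative assumptions.

    BASELINE = {"firewall_enabled": True, "ids_enabled": True}
    MAX_PATCH_AGE_DAYS = 30  # hypothetical patch-currency policy

    def compliant(vm):
        """Return True if a VM record meets the baseline controls."""
        controls_ok = all(vm.get(k) == v for k, v in BASELINE.items())
        patched_ok = vm.get("days_since_patch", 999) <= MAX_PATCH_AGE_DAYS
        return controls_ok and patched_ok

    inventory = [
        {"name": "web-01", "firewall_enabled": True, "ids_enabled": True,
         "days_since_patch": 12},
        {"name": "db-01", "firewall_enabled": True, "ids_enabled": False,
         "days_since_patch": 45},
    ]

    for vm in inventory:
        print(vm["name"], "OK" if compliant(vm) else "NON-COMPLIANT")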

However, what is net-new in a virtualised environment is how VMs are provisioned and managed. It is clearly much easier to deploy a virtual machine...

c3lsius | 12 Apr 2012 | 0 comments

With the amount of talk about big data recently, you'd be forgiven for thinking that it was all a done deal. The truth is, however, that the can of worms has only just been opened. What started as a reasonably succinct view of the distributed information structures required to support highly scalable, event-driven online sites (think: Hadoop) has degenerated into a general discussion about the data explosion and what to do with all the islands of information that exist out there.
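
For the curious, that "reasonably succinct view" is essentially the Hadoop-style model of distributed storage plus MapReduce processing. A minimal, single-machine sketch of the MapReduce half (the canonical word count, in Python) looks like this; real Hadoop distributes the same three stages across a cluster:

    # A minimal, in-memory sketch of the MapReduce pattern Hadoop
    # popularised: map each record to key/value pairs, shuffle by key,
    # then reduce each key's values. This toy runs on one machine.
    from collections import defaultdict

    def map_phase(record):
        for word in record.split():
            yield word.lower(), 1

    def reduce_phase(key, values):
        return key, sum(values)

    records = ["big data big deal", "data explosion"]

    # Shuffle: group intermediate values by key.
    groups = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)

    print(dict(reduce_phase(k, vs) for k, vs in groups.items()))
    # {'big': 2, 'data': 2, 'deal': 1, 'explosion': 1}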

While purists might complain about the disintegration of the boundaries around big data, this may be no bad thing. The fact is that companies large and small are struggling with the burgeoning size of all their data repositories, and who would not want to be able to use all available information as a business asset?

The quantities of information organizations are dealing with today bring new challenges, from...

Jon C | 12 Apr 2012 | 0 comments

I recently had the opportunity to host a panel of lawyers, discussing cloud computing and its impact given the current state of legislation. I took away three points from the debate:

  • Once contracts are in the mix, cloud very quickly becomes just another procurement mechanism. Whether it fits with how things have been done traditionally is irrelevant.
  • The hybrid cloud model is inevitable; indeed, it is already here. However, some behaviours remain inextricably linked to traditional, in-house models.
  • The complexities inherent in the hosted model mean that lawyers are not going to be out of jobs any time soon.

The conversation turned, inevitably given the panellists, to the risk factors which underpinned cloud computing, its procurement and operation. The fly in the ointment was the general acceptance that risk discussions have traditionally run on a parallel track to those about IT strategy or infrastructure delivery.

Such...

Jose Iglesias | 06 Apr 2012 | 0 comments

Isn't it funny how the human race can be so fickle? A few years ago, everybody - individuals, corporations, governments - was concerned about the future of the planet, to the extent that it coloured many discussions: "What's your green story?" was a pretty standard question for an industry analyst to ask, and public sector organisations were including sustainability criteria in their RFPs.

That was, of course, before the small matter of the global financial crisis, which understandably distracted attention from such altruistic aspirations. Current thinking suggests that we are happy to let our children's children worry about their own futures, while we concern ourselves with more pressing challenges such as keeping the business afloat, or putting food on the table.

Interestingly enough, the wave of attention about the planet's imminent collapse was preceded by a series of governance...

Marie Pettersson | 04 Apr 2012 | 0 comments

Consumerisation is nothing new. When personal computers first arrived (together with office and database software from companies like Lotus, WordPerfect and Microsoft) they enabled people with a bit of money to equip their home offices in much the same way as their workplaces. The key phrase here is, "with a bit of money," as the earliest adopters of home technology were frequently the more senior corporate staff. With a simple floppy disk drive providing the connection between home and work, executives were quickly impressing each other with their database prowess or skill in creating presentations. Roll forward a few decades and technology has become a lot more accessible and affordable.

These days we use the term 'consumerisation' to talk about smartphones and 'apps', use of online collaboration and storage, and indeed, having computers and printers at home that are often more powerful or functional than corporate-supplied kit. The...

D Thomson | 07 Mar 2012 | 0 comments

Over the years, I've seen a fair few maturity models applied to systems management and IT service delivery: a while back to “organic IT”, then to “utility computing”, and more recently to private cloud computing. In general they all aspire to reach “level 4” within the following model:

- 1 - Unstructured or chaotic - a free-for-all in which anything goes

- 2 - Structured - a basic handle on what's going on but still on the back foot

- 3 - Managed - things are properly under control and co-ordinated

- 4 - Dynamic - the kind of agile, responsive management all aspire to

Now I don't want to question such models, as they are generally pretty good. However, not many of the organisations I have visited have anything approaching level 4; if they do, it is in a few isolated areas of the organisation. All the same, IT and business go on, so clearly they must be doing something right.

Perhaps, however, we...

GregDay-SecurityCTO | 05 Mar 2012 | 1 comment

Many of the security issues we see with desktops and laptops today can be explained by the fact that such end-point computing devices were never designed to be connected together. It was only with the arrival of affordable network cards, then operating systems such as OS/2 and Windows 3.11, that PCs could be connected to the corporate LAN.

Since then, we’ve seen wave after wave of security issues as first smart-alec students, then malicious hackers, then commercially motivated practitioners of the dark arts devised increasingly complex attack vectors. From the earliest email-borne computer viruses to the kinds of breach we see today, each wave has prompted a protective response from security companies.

While nobody would suggest switching off all the protections that are in place today, most would accept that things would be done differently given the chance to start from scratch. PCs have become like a car with a thousand bumpers – while protected against...