Security Community Blog
Showing posts tagged with Critical System Protection
A L Johnson | 08 Apr 2014 | 3 comments

Symantec launched its 2014 Internet Security Threat Report (ISTR), Volume 19, which highlights how cybercriminals unleashed the most damaging series of cyberattacks in history – ushering in the era of the “Mega Breach.” Please visit the ISTR landing page for this year’s report and supplemental assets.

 

SebastianZ | 11 Feb 2014 | 0 comments

Microsoft Security Bulletin

On Tuesday 11 February, Microsoft released the monthly Security Bulletin Summary for February 2014. The summary includes seven Security Bulletins: four are classified as Critical and three as Important:

 

  • MS14-010    Cumulative Security Update for Internet Explorer (2909921)

Vulnerability impact: Critical - Remote Code Execution
Affected Software:
Microsoft Windows, Internet Explorer

  • MS14-011    Vulnerability in VBScript Scripting Engine Could Allow Remote Code Execution (2928390)

Vulnerability impact: Critical - Remote Code Execution
Affected Software: Microsoft Windows

  • MS14-007    Vulnerability in Direct2D Could Allow Remote Code Execution...
Brandon Noble | 30 Dec 2013 | 2 comments

I guess we need to face it. Sality is here to stay.

We have been dealing with new Sality variants for more than eight years, and with the Sality.AE family for a little over five, and the variants keep coming. It has become one of the most common file infectors reported by Enterprise customers. With its ability to move through shares and disable AV, it’s one of the most destructive and tricky threats we have out there. That said, it’s not too hard to stop, provided you have two things: the first is an understanding of how it spreads and infects, the second is a willingness to mount the proper defense while you seek out the hidden pockets of this threat and eradicate it.

So, first things first. How does it spread?

This is a file infector and it can only spread through shares. It uses two methods to infect, which I refer to as a “Push” and a “Pull”. Managing these attacks will keep the threat from spreading to more computers.

 

...

captain jack sparrow | 03 Dec 2013 | 0 comments

Researchers have shown that malware can transmit information between computers using high-frequency sound waves inaudible to the human ear. The duo successfully sent passwords and more between non-networked Lenovo T400 laptops via the notebooks’ built-in microphones and speakers. Freaky-deaky!
The infected victim sends all recorded keystrokes to the covert acoustical mesh network. Infected drones forward the keystroke information inside the covert network until the attacker is reached.
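The underlying idea, frequency-shift keying near the top of the audible range, can be sketched in a few lines. This is a minimal illustration, not the researchers' actual implementation; the carrier frequencies, bit duration and sample rate below are assumptions chosen for clarity:

```python
import math

SAMPLE_RATE = 44100            # typical laptop sound-card rate
F_ZERO, F_ONE = 18000, 19000   # near-ultrasonic carriers (illustrative choices)
BIT_SAMPLES = 2205             # 50 ms per bit at 44.1 kHz

def encode_bits(bits):
    """Map each bit to a burst of sine samples at one of two frequencies (FSK)."""
    samples = []
    for bit in bits:
        freq = F_ONE if bit else F_ZERO
        for n in range(BIT_SAMPLES):
            samples.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return samples

signal = encode_bits([1, 0, 1, 1])
print(len(signal))  # 4 bits * 2205 samples = 8820
```

Playing those samples through the speaker and running a matching tone detector on the receiving microphone gives a slow but air-gapped channel, which is all a keylogger needs.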

ref:
http://www.pcworld.com/article/2068525/researchers...

darci_hunt | 14 Aug 2013 | 0 comments

Today, nearly all of an agency’s mission-critical functions depend on safe and secure information technology systems. With cyber threats evolving and growing at an exponential rate, and with government increasingly reliant on technology to deliver core services, agencies need a robust cyber defense.

Continuous Monitoring is certainly not a new term, but if you were to ask 10 people how they would define this term, you’re likely to get 10 different responses. Ken Durbin, Cyber & Continuous Monitoring Practice Manager, Symantec, provided expert insights on Symantec’s view of Continuous Monitoring and how agencies are adopting continuous monitoring programs as a means to protect government data and infrastructure. Durbin also highlights the benefits, best practices and challenges to adopting a continuous monitoring program.

Continuous monitoring is one part of a six-step process in the NIST Risk Management Framework (RMF), from NIST...
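For reference, the six RMF steps can be written out as data, with monitoring as the final, ongoing step (step names per NIST SP 800-37):

```python
# The six steps of the NIST Risk Management Framework (SP 800-37).
# Continuous Monitoring corresponds to the last, ongoing step.
RMF_STEPS = (
    "Categorize",
    "Select",
    "Implement",
    "Assess",
    "Authorize",
    "Monitor",
)

print(len(RMF_STEPS))   # 6
print(RMF_STEPS[-1])    # Monitor
```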

MFox70 | 23 Jul 2013 | 0 comments

I attended a webinar recently about the move from physical to virtual servers in large corporations. The estimate given was that today approximately 70% of all servers can be virtualised very quickly, but the remaining 30% can take several years of effort. Hypervisor vendors are working hard to solve this problem, but the interesting finding was that a large section of that problematic 30% of servers are running legacy applications, or are indeed legacy operating systems.

This is odd, as you would think that any IT operations person would want to migrate a legacy server from physical to virtual hardware as soon as humanly possible.

 

Legacy systems are still around for a few reasons.

1. Laziness

2. Applications cannot be modified to work on newer OS platforms

3. Software developers have long since left the company (relates to point 2)

4. Legacy systems are connected to business-critical servers, with little or no...

MFox70 | 26 Jun 2013 | 0 comments

Patching.

 

It’s a painful topic for most IT professionals: the eternal battle between keeping a system running and functionally up to date while ensuring it is secure.

Some organisations I talk to have a monthly patching cycle which takes a week out of every month to complete. Yes, for 3 months of the year they have teams of staff patching applications and servers. This is a costly and time-consuming process, and I am sure these engineers would rather be doing something more interesting!
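The arithmetic behind that "3 months of the year" figure is straightforward:

```python
# One week of patching per monthly cycle, every month of the year.
weeks_per_cycle = 1
cycles_per_year = 12

weeks_spent = weeks_per_cycle * cycles_per_year
print(weeks_spent)                    # 12 weeks of patching per year

months_equivalent = round(weeks_spent / 52 * 12)
print(months_equivalent)              # roughly 3 months of the year
```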

 

Yet it is arguable whether a fully patched system really IS secure. Many hackers and malware writers look for vulnerabilities that have not even been discovered by the software vendors, a concept known as a Zero Day threat, so having a system that is patched against “yesterday’s” threats is not exactly ideal. Let’s face it, malware writers and hackers create exploits quicker than corporates patch their systems.

...

James Hanlon | 10 Jun 2013 | 1 comment

You must have been taking a long (and probably well deserved) holiday if you have not noticed the increasing use of the term “cyber” in the press recently.

Anything security related is now a cyber risk, a cyber incident or a cyber attack. Governments are driving cyber strategies, citizens need to be cyber aware, businesses are tabling cyber projects, companies are building cyber capabilities, vendors are creating cyber solutions and consultancies are creating cyber practices to help you enhance your cyber resilience.

With all this hype, the key question is: what is different between the infrastructure and information security we have been doing for years and this new cyber approach? It is a good question, because everyone seems to have a different perspective on cyber, and for very good reasons.

At Symantec, we get the opportunity to discuss the different interpretations of cyber with many types of users and businesses – consumers, small and...

MFox70 | 31 May 2013 | 1 comment

Does your customer have a requirement for monitoring servers or for Intrusion Detection? Are they asking about Real-time File Integrity Monitoring (FIM)? Have they recently failed an IT compliance or regulatory audit?

 

Usually a request to monitor server activity, or user and administrative access to a server, is driven by a few business needs.

It could be a Compliance or Audit requirement, or a need to pass information to a Security Information and Event Management (SIEM) tool or Security Operations Centre (SOC) team, but more typically it is simply deemed good IT practice to keep an eye on how your servers are being used on a daily basis.
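As an illustration of what a FIM check does under the hood, here is a minimal sketch: hash each monitored file into a baseline, then re-hash later and report mismatches. Real FIM products add real-time filesystem hooks, audit trails and tamper protection; this is only the core idea, and the function names are my own:

```python
import hashlib
import os

def hash_file(path):
    """SHA-256 of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths):
    """Record a known-good hash for each monitored file."""
    return {p: hash_file(p) for p in paths}

def check_integrity(baseline):
    """Return the files whose current hash no longer matches the baseline."""
    return [p for p, digest in baseline.items()
            if not os.path.exists(p) or hash_file(p) != digest]
```

Run `build_baseline` on a known-clean system, store the result safely, and schedule `check_integrity` to flag any drift for the audit or SOC team.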

 

Let’s think about the rationale for those points.

Firstly, if you are being audited, or someone in a risk and compliance role is scrutinising your environment, the process of generating incidents which are then analysed and potentially acted upon is actually the housekeeping role that...

MFox70 | 01 May 2013 | 1 comment

Whitelisting has been an industry buzzword for the past 18 months or so, and is seen by some as a panacea for stopping malware from spreading within organisations and for controlling threats inside your environment. Indeed, some of Symantec’s products use whitelisting as an additional method of controlling software behaviour and limiting the applications that employees can or cannot use.

 

Whitelisting generally involves a process of learning exactly which applications, operating system components and hardware drivers are installed on a server or workstation, collating that information centrally, and then allowing an administrator to approve or deny the use of these tools.

Once this process has initially completed, enforcement of this list of applications is then applied to the target machines. Theoretically, this has given control back to the organisation in relation to what software is allowed to run on the corporate computers.
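The learn-then-enforce cycle described above can be sketched in a few lines. This assumes applications are identified by the hash of their binary; the “binaries” below are illustrative stand-ins, not real product data:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Identify an application by the hash of its binary."""
    return hashlib.sha256(data).hexdigest()

# Learning phase: collect hashes of the software observed on the machine,
# which an administrator then approves centrally.
approved = {
    sha256_hex(b"contents of approved app A"),
    sha256_hex(b"contents of approved app B"),
}

# Enforcement phase: only binaries whose hash is on the list may run.
def may_run(binary: bytes) -> bool:
    return sha256_hex(binary) in approved

print(may_run(b"contents of approved app A"))  # True: on the whitelist
print(may_run(b"unknown payload"))             # False: blocked
```

The design choice is default-deny: anything not explicitly learned and approved is refused, which is what gives the organisation control back, and also what makes the initial learning phase so important to get right.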

 ...