Security Community Blog
Showing posts tagged with Control Compliance Suite
darci_hunt | 14 Aug 2013 | 0 comments

Today, nearly all of an agency’s mission-critical functions depend on safe and secure information technology systems. With cyber threats evolving and growing at an exponential rate, and with government increasingly reliant on technology to deliver core services, agencies need a robust cyber defense.

Continuous Monitoring is certainly not a new term, but if you were to ask 10 people how they would define it, you’re likely to get 10 different responses. Ken Durbin, Cyber & Continuous Monitoring Practice Manager at Symantec, provided expert insight into Symantec’s view of Continuous Monitoring and how agencies are adopting continuous monitoring programs as a means to protect government data and infrastructure. Durbin also highlighted the benefits, best practices and challenges of adopting a continuous monitoring program.

Continuous monitoring is one part of a six-step process in the NIST Risk Management Framework (RMF), from NIST...

MFox70 | 23 Jul 2013 | 0 comments

I recently attended a webinar about the move from physical to virtual servers in large corporations. The headline finding was that today approximately 70% of all servers can be virtualised very quickly, but the remaining 30% can take several years of effort. Hypervisor vendors are working hard to solve this problem, but the interesting finding was that a large portion of that problematic 30% of servers are running legacy applications or are indeed legacy operating systems.

This is odd, as you would think any IT operations person would want to migrate a legacy server from physical to virtual hardware as soon as humanly possible.

Legacy systems are still around for a few reasons:

1. Laziness

2. Applications cannot be modified to work on newer OS platforms

3. Software developers have long since left the company (relates to point 2)

4. Legacy systems are connected to business-critical servers, with little or no downtime...

MFox70 | 26 Jun 2013 | 0 comments

Patching.

It’s a painful topic for most IT professionals, seen as the eternal battle between keeping a system running and functionally up to date, yet ensuring it is secure.

Some organisations I talk to have a monthly patching cycle that takes a week out of every month to complete. Yes, for three months of the year they have teams of staff patching applications and servers. This is a costly and time-consuming process, and I am sure these engineers would rather be doing something more interesting!

Yet it is arguable whether a fully patched system really IS secure. Many hackers and malware writers look for vulnerabilities that have not even been discovered by the software vendors, a concept known as a zero-day threat, so having a system that is patched against yesterday’s threats is not exactly ideal. Let’s face it: malware writers and hackers create exploits faster than corporations patch their systems.

So which other mechanisms...

James Hanlon | 10 Jun 2013 | 1 comment

You must have been taking a long (and probably well deserved) holiday if you have not noticed the increasing use of the term “cyber” in the press recently.

Anything security related is now a cyber risk, a cyber incident or a cyber attack. Governments are driving cyber strategies, citizens need to be cyber aware, businesses are tabling cyber projects, companies are building cyber capabilities, vendors are creating cyber solutions and consultancies are creating cyber practices to help you enhance your cyber resilience.

With all this hype, the key question is: what is different about this new cyber approach compared to the infrastructure and information security we have been doing for years? This is a good question, because everyone seems to have a different perspective on cyber, and for very good reasons.

At Symantec, we get the opportunity to discuss the different interpretations of cyber with many types of users and businesses – consumers, small and...

John Santana | 28 Apr 2013 | 6 comments

Hi People,

I'm sharing a white paper that I gathered and read through over the weekend, covering independent testing that benchmarks the most common anti-virus implementations in the industry (see the attached files, including the results updated as of September 2013).

The paper clearly indicates that Symantec Endpoint Protection outshines the competition, thanks to Symantec's experience and maturity in the computer security industry.

Hope this article can be a helpful reference for you all.

Cheers !

darci_hunt | 10 Apr 2013 | 0 comments

The Critical Security Controls (CSCs) are being adopted by federal and state agencies in the U.S., Canada and elsewhere to increase visibility into advanced threats, shore up defenses and, ultimately, benchmark and improve risk posture.

To increase the limited information currently available about implementing the controls, the SANS Institute is conducting a 20-question survey for IT professionals, business unit managers and security/compliance experts. The survey was developed to find out what controls they're adopting, why, and how. The survey also explores how integrated the CSCs are in organizations that have adopted the controls, and whether any adopters have reached the stage where they can use the controls for benchmarking and to improve their risk postures.

"The Critical Security Controls are successful because of their open community approach - people and...

Brian Modena | 05 Mar 2013 | 0 comments

MOUNTAIN VIEW, Calif. – March 5, 2013 – Symantec today unveiled its Control Compliance Suite Vendor Risk Manager, enabling customers to better assess their third-party risk and protect their reputation and sensitive data. Control Compliance Suite Vendor Risk Manager provides a solid foundation on which to build a vendor risk management program. Customers are able to gain visibility into their organization’s vendor risk exposure and automate the ongoing assessment of vendors’ IT security readiness.

Control Compliance Suite Vendor Risk Manager arms organizations with the following capabilities:

  • Auto-calculated vendor risk scores based on multiple evidence sources
  • Vendor tiering based on data risk and business criticality
  • Shared Assessments content for controls-...
Chaitali | 20 Feb 2013 | 0 comments

Issue: When the result of a Collection Evaluation Report job is exported in CSV format, the cells break, producing a non-uniform report output.

Cause: When the evidence of the failed checks is large, Microsoft Excel cannot handle the large character count of an individual cell. This causes the cells to break.

Explanation: Microsoft Excel can handle cell contents of at most 32,767 characters. The first 1,024 characters display in the cell and the remainder appear in the formula bar. If the character count of the evidence in a cell exceeds 32,767 characters, the cell will break. This is a limitation of Microsoft Excel.
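If you post-process the exported CSV yourself, you can also work around the limit by trimming oversized cells before opening the file in Excel. A minimal sketch (the function names and the truncation marker are illustrative, not part of Control Compliance Suite):

```python
import csv

# Excel stores at most 32,767 characters per cell; longer CSV cells
# "break" on import. Trim evidence fields to stay within the limit.
EXCEL_CELL_LIMIT = 32767

def truncate_cell(value, limit=EXCEL_CELL_LIMIT):
    """Trim a cell value to Excel's per-cell character limit."""
    text = str(value)
    if len(text) <= limit:
        return text
    marker = "...[truncated]"
    return text[: limit - len(marker)] + marker

def write_report(rows, path):
    """Write report rows to CSV, keeping every cell within the limit."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for row in rows:
            writer.writerow(truncate_cell(cell) for cell in row)
```

This loses the tail of very long evidence strings, so it is only suitable when a readable spreadsheet matters more than complete evidence text.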

Solutions:

Solution 1:

Instead of exporting the report in CSV format, export the result to Excel by the following method:

Go to the Evaluation Result >> Select "Asset Based View" >> Highlight and select the assets >> Right Click on the assets >>...

Chaitali | 20 Feb 2013 | 0 comments

How to report on an agent-based Unix server hosting multiple databases

Desired reports:

- Reports from the Unix Host

- Reports from DB1, DB2, DB3

Refer to the diagram below:

Solution:

To report on the Unix Host:

Install the Unix agent on the Unix Host.

- Register Interface 1 with BVIS using the following command:

  • /setup.sh -a <IP of BVIS> <IP of Interface 1> <Username> <Password> -s UNX

To report on DB1, DB2, DB3:

- Register Interface 2 with BVIS using the -lip (logical IP) commands:

  • /setup.sh -a <IP of BVIS> <IP of Interface 1> <Username> <Password> -s UNX -lip <IP of Interface 2>
  • /setup.sh -a <IP of BVIS> <IP of Interface 1> <...
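When a host exposes several database interfaces, the registration commands above can be generated rather than typed one by one. A minimal sketch that only builds the command strings (all IPs, credentials and the script path are placeholders, not real values):

```python
# Build the setup.sh registration commands for a host interface plus the
# logical IPs (-lip) of each database interface. All values below are
# placeholders for illustration only.
BVIS_IP = "10.0.0.1"                               # IP of BVIS
HOST_IP = "10.0.0.10"                              # IP of Interface 1
DB_IPS = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]   # DB interfaces

def registration_commands(bvis, host, db_ips, user, password):
    """Return the host registration command plus one -lip command per DB."""
    base = f"./setup.sh -a {bvis} {host} {user} {password} -s UNX"
    return [base] + [f"{base} -lip {ip}" for ip in db_ips]

for cmd in registration_commands(BVIS_IP, HOST_IP, DB_IPS, "admin", "secret"):
    print(cmd)
```

Printing the commands first lets you review them before running anything against BVIS.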
Chaitali | 19 Feb 2013 | 0 comments

How to determine the cause of Scheduled Task or Query failure

Solution:

The cause of failed schedules can be determined from the Schedule Logs.

The logs for the RMS schedules are stored in text format at the following location: 

\Program Files (x86)\Symantec\RMS\data\<User Name>\ScheduleLogs

Note:

The name of each log file corresponds to the name of its schedule in RMS.

These log files are automatically overwritten by new log files when the respective schedule re-runs.

At any given point in time, each schedule in RMS has exactly one corresponding schedule log file, from its latest run.