Security Community Blog
Kari Ann | 05 Aug 2013 | 1 comment

 

SC Magazine conducted a group test of endpoint security products and reviewed Symantec Endpoint Protection 12.1.2 in the August 1, 2013 issue. Symantec Endpoint Protection 12.1.2 earned a very positive review, with an overall rating of five out of five stars in this important trade publication. The review specifically called out SEP's protection of millions of endpoints, its SONAR engine, Insight technology, and intuitive usability, concluding that "Symantec has put together a solid product." The full review can be read here.


phlphrrs | 31 Jul 2013 | 2 comments

This question really surrounds the issue of security: whether some of the major cloud providers have the levels of security they claim to have to protect enterprise users and information, and whether they are willing to be open and frank about that security or the gaps in it.  This has been a pretty common problem since the days of application hosting providers.  I recall those providers getting inundated with requests to have their environments audited ad nauseam against whatever the security standard du jour was at the time.  Then there was the issue of encryption and whether it was appropriately designed and implemented; again, we asked that they have an industry expert attest to the security of that encryption.  Then there was the requirement to audit the provider on an ongoing basis to ensure the security requirements remained in place over time.  Clearly, all these good things need to be done.  But, “why...

Wally | 29 Jul 2013 | 0 comments

We're testing SEP 12.1.2 on a 64-bit Windows 7 Pro client. When we run a full scan, we sometimes get a large difference in the number of files scanned. For example, sometimes SEP will report 170,000 files scanned; then, if we immediately run another full scan, SEP will report 80,000 files scanned.

The answer from Symantec Support is that this is normal behaviour for the SEP 12.1 client.

Support says that the first full scan after an AV defs update rescans everything, including the file cache. Subsequent full scans performed before the next AV defs update do not rescan everything, as some files are marked as already having been scanned. Support says the product was designed this way for performance.
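The behaviour Support describes can be pictured as a scan cache keyed to the current definitions version: files already scanned clean are skipped until the definitions change. Below is a minimal, purely illustrative sketch of that idea in Python; it is not Symantec's actual implementation, and the file paths and definitions versions are made up.

```python
# Illustrative sketch of a definitions-keyed scan cache (hypothetical, not SEP's code).
class ScanCache:
    def __init__(self):
        self.defs_version = None   # definitions version the cache is valid for
        self.clean_files = set()   # files already scanned clean under that version

    def needs_scan(self, path, current_defs):
        # A new definitions version invalidates the whole cache,
        # so the first full scan after an update rescans everything.
        if current_defs != self.defs_version:
            self.defs_version = current_defs
            self.clean_files.clear()
        return path not in self.clean_files

    def mark_clean(self, path):
        self.clean_files.add(path)


cache = ScanCache()
files = ["C:/Windows/notepad.exe", "C:/temp/report.docx"]  # hypothetical paths

for defs in ["2013.07.29.r1", "2013.07.29.r1", "2013.07.30.r1"]:
    scanned = [f for f in files if cache.needs_scan(f, defs)]
    for f in scanned:
        cache.mark_clean(f)
    print(defs, "scanned", len(scanned), "files")
```

Running this prints a full count for the first scan, a much smaller (here zero) count for an immediate rescan, and a full count again once the definitions version changes, which matches the pattern described above.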

In our scans, we're seeing between 1,500 and 3,000 files trusted, but apparently the number of trusted files is not the reason for the difference in the full...

MFox70 | 23 Jul 2013 | 0 comments

I attended a webinar recently that discussed the move from physical to virtual servers in large corporations. The figure quoted was that today approximately 70% of all servers can be virtualised very quickly, but the remaining 30% can take several years of effort. Hypervisor vendors are working hard to sort this problem out, but the interesting finding was that a large section of that problematic 30% of servers are running legacy applications or are indeed legacy operating systems.

This is odd, as you would think that any IT operations person would want to migrate a legacy server from physical to virtual hardware as soon as humanly possible.

 

Legacy systems are still around for a few reasons.

1. Laziness

2. Applications cannot be modified to work on newer OS platforms

3. Software developers have long since left the company (relates to point 2)

4. Legacy systems are connected to business-critical servers, with little or no...

smartblogger | 22 Jul 2013 | 0 comments

An SSL certificate is a certificate that shows that a website is using Secure Sockets Layer (SSL) for its connections. This means that information transmitted through the site is protected by an appropriate encryption and decryption system. Securing a site with SSL creates a pair of keys that are used to encrypt data and later decrypt it. On the website, any information that the site visitor enters into the portal is encrypted using a public key. Therefore, when it is transmitted, it travels as an encrypted piece of data that other people cannot get access to. Additionally, should they manage to get hold of this data, it will be meaningless, as they will be unable to translate it into useful information. The second key is a private key that is held by the website owner. The website owner uses this to decrypt the encrypted data that the visitors of the website have transmitted. This translates it back into information that they can...
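The public/private key relationship described above can be demonstrated with a short script. This is only an illustrative sketch of asymmetric encryption using Python's third-party cryptography package (a real SSL/TLS session additionally negotiates a symmetric session key); the message text is made up.

```python
# Illustrative sketch of public-key encryption/decryption (not a full TLS handshake).
# Requires the third-party "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The website owner holds the private key; the public key is shared with visitors.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The visitor encrypts submitted data with the public key...
ciphertext = public_key.encrypt(b"card number 4111-1111-1111-1111", oaep)

# ...and only the holder of the private key can decrypt it.
plaintext = private_key.decrypt(ciphertext, oaep)
print(plaintext)  # b'card number 4111-1111-1111-1111'
```

Anyone who intercepts ciphertext without the private key sees only unreadable bytes, which is the point the post is making.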

Mike Maxwell | 11 Jul 2013 | 0 comments

-- Originally published July 9th on StateScoop --

Every now and then, state and federal government agencies will collaborate on a project so sensible and logical that it’s hard to find much, if any, opposition.

The FBI’s Criminal Justice Information Services Division (CJIS) might be one of those projects—a focal point and central repository for gathering and compiling intelligence from local, state, and federal criminal justice agencies.

But there’s another issue pertaining to CJIS that’s practically just as logical and obvious. And yet for some reason, we’re still struggling to put it into practice.

That issue is access control. Once we’ve gotten all of this information compiled, shouldn’t we take the necessary steps to secure it?

The problem isn’t the technology. (Strong authentication systems are readily available.) And it’s not a lack of desire. (It’s hard to...

Mike Maxwell | 11 Jul 2013 | 1 comment

-- Originally published July 2nd on StateScoop --

I was on a panel in Alaska a few weeks back, and the topic shifted to public- versus private-sector innovation.

The non-controversial part of my response was that state governments are innately a bit more cautious and deliberate in their technology decision-making. In fact, states tend to stay anywhere from a few months to a few years behind the private sector’s technology adoption pace.

But the mildly controversial part of my response was that I found this trend generally unproblematic. Unlike some government contractors, I’m not in favor of pushing bleeding-edge innovations on state government buyers before they’re ready.

But I have one exception to that rule—cybersecurity.

In the rapidly evolving, real-time-centric field of cybersecurity, there is simply no place for a two-year (or even two-month) lag in technology advancement.

In fact, if a state...

smartblogger | 04 Jul 2013 | 0 comments

An SSL certificate has long been used to secure credit card transactions, logins, and transfers of data, and more recently to secure browsing on social media sites. The certificate binds together domain, server, and host names; it can also bind a company name and location. It is advisable for a company or organization to install this certificate in order to have secure sessions during browsing. SSL is the abbreviation for Secure Sockets Layer, a protocol used to ensure the safety of transactions between web servers and browsers. A website with this certificate helps ensure all participants in that space are secure, including the end users. There are different types of certificates, including single domain, multiple domain, extended validation single domain, extended validation multi-domain, UCC/Exchange, and Wildcard.
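One way to see what a certificate binds together is to pull it from a live server and read its subject and subject alternative names. Below is a minimal sketch using Python's standard ssl and socket modules; the hostname is just a placeholder.

```python
# Illustrative sketch: fetch a server's SSL certificate and print what it binds.
import socket
import ssl

hostname = "www.example.com"  # placeholder host; substitute a real site

context = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()

print(cert["subject"])                 # organization and common name bound to the cert
print(cert.get("subjectAltName", ()))  # domain/host names the certificate covers
print(cert["notAfter"])                # expiry date
```

For a wildcard or multi-domain certificate, the subjectAltName entries are where the additional covered names show up.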

Purpose of SSL Certificate

This certificate is essential for online businesses and organizations. When running an online business, your...

MFox70 | 26 Jun 2013 | 0 comments

Patching.

 

It’s a painful topic for most IT professionals, seen as the eternal battle between keeping a system running and functionally up to date, and ensuring it is secure.

Some organisations I talk to have a monthly patching cycle that takes a week out of every month to complete. That adds up to roughly three months of the year in which they have teams of staff patching applications and servers. This is a costly and time-consuming process, and I am sure those engineers would rather be doing something more interesting!

 

Yet it is arguable whether a fully patched system really IS secure. Many hackers and malware writers look for vulnerabilities that have not even been discovered by the software vendors, a concept known as a zero-day threat, so having a system that is patched against yesterday’s threats is not exactly ideal. Let’s face it: malware writers and hackers create exploits quicker than corporates patch their systems.

...

linda_park | 25 Jun 2013 | 0 comments

Employees are the backbone of your organization, but they’re also the biggest risk to the very data that makes your business thrive. Whether an insider maliciously attempts to take your confidential data for personal gain, or simply doesn’t know better and mishandles confidential data, thereby putting it at risk, insiders significantly contribute to data loss. In fact, according to the latest Ponemon Cost of a Data Breach study, human errors and system glitches caused 64 percent of data breaches last year, while the insider threat has remained the most consistent issue facing security teams over time, increasing 22 percent since the first study. But the insider problem is a solvable one.

To keep corporate data safe, people, processes and technology must holistically address the insider threat. Symantec offers the market-leading data loss prevention (DLP) solution to protect data at rest, in motion and in...