Security Community Blog
darci_hunt | 14 Aug 2013 | 0 comments

Today, nearly all of an agency’s mission-critical functions depend on safe and secure information technology systems. With cyber threats constantly evolving and growing, and with government increasingly reliant on technology to deliver core services, agencies need a robust cyber defense.

Continuous Monitoring is certainly not a new term, but if you were to ask 10 people how they would define it, you would likely get 10 different responses. Ken Durbin, Cyber & Continuous Monitoring Practice Manager, Symantec, provided expert insights on Symantec’s view of Continuous Monitoring and how agencies are adopting continuous monitoring programs as a means to protect government data and infrastructure. Durbin also highlighted the benefits, best practices and challenges of adopting a continuous monitoring program.

Continuous monitoring is one part of a six-step process in the NIST Risk Management Framework (RMF), from NIST...

Tariq Naik | 06 Aug 2013 | 0 comments

This article is based on widespread Internet reports from the Black Hat conference in Las Vegas.

Recent advances in academic mathematics and cryptology research suggest that, within the next four to five years, algorithms may emerge that can break RSA- and Diffie-Hellman-based encryption without obtaining the secret key and without needing massive computing resources for long periods of time. These encryption schemes are widely used on the Internet today to keep sensitive data private, from encrypting Internet communications used for electronic commerce, to securing software updates, to encrypting global corporate and government networks.

The key to their security today is that there is no practical or efficient algorithm that can break these encryption schemes without obtaining the secret keys. The day such an algorithm is found, the encryption, and with it the trust on which the Internet works, will be broken.
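To make that dependence concrete, here is a minimal sketch (not part of the original article) of textbook Diffie-Hellman in Python, with deliberately tiny, insecure numbers. An eavesdropper who sees only the public values must solve the discrete logarithm problem to recover a private exponent; the brute-force search at the end is feasible only because the toy modulus is small, and the concern above is that mathematical advances could make this kind of recovery efficient even at real key sizes.

```python
# Toy Diffie-Hellman exchange -- illustrative only, with insecure tiny parameters.

# Public parameters: a prime modulus p and a generator g.
p = 23
g = 5

# Each party keeps a private exponent and publishes g^x mod p.
alice_private = 6
bob_private = 15
alice_public = pow(g, alice_private, p)   # sent in the clear
bob_public = pow(g, bob_private, p)       # sent in the clear

# Each side combines the other's public value with its own private exponent.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)
assert alice_shared == bob_shared          # both arrive at the same shared secret

def brute_force_discrete_log(public, g, p):
    """Recover a private exponent by trial -- feasible only for tiny p."""
    for x in range(p):
        if pow(g, x, p) == public:
            return x
    return None

print(brute_force_discrete_log(alice_public, g, p))   # prints 6
```

With real parameters (a 2048-bit modulus), no known algorithm performs this recovery efficiently; that assumption is the "secret key" barrier described above.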

...
Kari Ann | 05 Aug 2013 | 1 comment


SC Magazine conducted a group test of endpoint security products and reviewed Symantec Endpoint Protection 12.1.2 in the August 1, 2013 issue. Symantec Endpoint Protection 12.1.2 received a very positive review, earning an overall rating of five out of five stars in this important trade publication. The review specifically called out SEP’s protection of millions of endpoints, its SONAR engine, Insight technology and intuitive usability, concluding that, “Symantec has put together a solid product.” The full review can be read here.


phlphrrs | 31 Jul 2013 | 2 comments

This question really comes down to security: whether the major cloud providers have the levels of security they claim to have to protect enterprise users and their information, and whether they are willing to be open and frank about that security, or about whatever gaps exist. This has been a pretty common problem since the days of application hosting providers. I recall those providers being inundated with requests to have their environments audited, ad nauseam, against whatever the security standard du jour was at the time. Then there was the issue of encryption and whether it was appropriately designed and implemented; again, we asked that they have an industry expert attest to the security of that encryption. Then there was the requirement to audit the provider on an ongoing basis to ensure the security requirements remained in place over time. Clearly, all these good things need to be done. But, “why...

Wally | 29 Jul 2013 | 0 comments

We're testing SEP 12.1.2 on a 64-bit Windows 7 Pro client. When we run a full scan, sometimes we get a large difference in the number of files scanned. For example, sometimes SEP will report 170,000 files scanned, then if we immediately run another full scan, SEP will report 80,000 files scanned.

The answer from Symantec Support is that this is normal behaviour for the SEP 12.1 client.

Support says that the first full scan after an AV defs update rescans everything, including the file cache. Subsequent full scans performed before the next AV defs update do not rescan everything, as some files are marked as already having been scanned. Support says the product was designed this way for performance.
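As an illustration only, here is a rough, hypothetical sketch in Python of the kind of definitions-keyed scan cache that Support's explanation suggests. This is not Symantec's actual implementation; it just shows why a full scan run immediately after a definitions update touches every file, while an immediate rerun touches far fewer.

```python
# Hypothetical sketch of a definitions-keyed scan cache -- not Symantec's code.

scan_cache = {}                   # path -> defs version the file was last scanned with
current_defs_version = "rev-1"    # made-up identifier for the currently loaded AV defs

def needs_scan(path):
    """A file is rescanned only if it hasn't been scanned with the current defs."""
    return scan_cache.get(path) != current_defs_version

def full_scan(all_files):
    scanned = 0
    for path in all_files:
        if needs_scan(path):
            # ...the real AV engine would inspect the file contents here...
            scan_cache[path] = current_defs_version
            scanned += 1
    return scanned

files = [f"file{i}" for i in range(170_000)]
print(full_scan(files))   # first scan after a defs update: 170000
print(full_scan(files))   # immediate rerun: 0 in this toy model, since nothing changed
```

In the real client, files that change on disk between scans would still be rescanned, which presumably keeps the second count well above zero, but this kind of cache would account for the large drop.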

In our scans, we're seeing between 1,500 and 3,000 files trusted, but apparently the number of trusted files is not the reason for the difference in the full...

MFox70 | 23 Jul 2013 | 0 comments

I recently attended a webinar about the move from physical to virtual servers in large corporations. The estimate given was that today approximately 70% of all servers can be virtualised very quickly, but the remaining 30% can take several years of effort. Hypervisor vendors are working hard to sort this problem out, but the interesting finding was that a large portion of that problematic 30% of servers are running legacy applications or indeed legacy operating systems.

This is odd, as you would think that any IT operations person would want to migrate a legacy server from physical to virtual hardware as soon as humanly possible.


Legacy systems are still around for a few reasons.

1. Laziness

2. Applications cannot be modified to work on newer OS platforms

3. Software developers have long since left the company (relates to point 2)

4. Legacy systems are connected to business-critical servers, with little or no...

smartblogger | 22 Jul 2013 | 0 comments

An SSL certificate is a certificate that shows that a website is using Secure Sockets Layer for its connections. This means that information transmitted through the site is protected by an appropriate encryption and decryption system. Securing a site with SSL creates a dual system of keys that are used to encrypt data and later decrypt it. On the website, any information that a visitor enters into the portal is encrypted using a public key. Therefore, when it is transmitted, it travels as encrypted data that other people cannot gain access to. Additionally, should they manage to get hold of this data, it will be meaningless, as they will be unable to translate it into useful information. The second key is a private key that is held by the website owner. The website owner uses this to decrypt the encrypted data that visitors to the website have transmitted. This translates it back into information that they can...
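The public/private split described above can be illustrated with the third-party Python cryptography package. This is a sketch of the general idea only; in practice SSL/TLS mainly uses the certificate's key pair to authenticate the server and establish a symmetric session key, rather than encrypting every byte with the public key.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair; in a real deployment the private key stays on the web
# server and the public key is distributed in the site's certificate.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone holding the public key can encrypt...
message = b"details entered by a site visitor"
ciphertext = public_key.encrypt(message, oaep)

# ...but only the holder of the private key can turn the ciphertext back
# into useful information.
assert private_key.decrypt(ciphertext, oaep) == message
```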

Mike Maxwell | 11 Jul 2013 | 0 comments

-- Originally published July 9th on StateScoop --

Every now and then, state and federal government agencies will collaborate on a project so sensible and logical that it’s hard to find much—if any—opposition.

The FBI’s Criminal Justice Information Services Division (CJIS) might be one of those projects—a focal point and central repository for gathering and compiling intelligence from local, state, and federal criminal justice agencies.

But there’s another issue pertaining to CJIS that’s practically just as logical and obvious. And yet for some reason, we’re still struggling to put it into practice.

That issue is access control. Once we’ve gotten all of this information compiled, shouldn’t we take the necessary steps to secure it?

The problem isn’t the technology. (Strong authentication systems are readily available.) And it’s not a lack of desire. (It’s hard to...

Mike Maxwell | 11 Jul 2013 | 1 comment

-- Originally published July 2nd on StateScoop --

I was on a panel in Alaska a few weeks back, and the topic shifted to public- versus private-sector innovation.

The non-controversial part of my response was that state governments are innately a bit more cautious and deliberate in their technology decision-making. In fact, states tend to stay anywhere from a few months to a few years behind the private sector’s technology adoption pace.

But the mildly controversial part of my response was that I found this trend generally unproblematic. Unlike some government contractors, I’m not in favor of pushing bleeding-edge innovations on state government buyers before they’re ready.

But I have one exception to that rule—cybersecurity.

In the rapidly evolving, real-time-centric field of cybersecurity, there is simply no place for a two-year (or even two-month) lag in technology advancement.

In fact, if a state...

smartblogger | 05 Jul 2013 | 0 comments

SSL certificates have been used to secure credit card transactions, logins and data transfers, and more recently to secure browsing on social media sites. A certificate binds together a domain name, server name and host name; it can also bind a company name and location. It is advisable for a company or organization to install a certificate in order to provide secure sessions during browsing. SSL is the abbreviation for Secure Sockets Layer, a protocol used to ensure the safety of transactions between web servers and browsers. A website with this certificate helps ensure that all participants in that space, including the end users, are secure. There are different types of certificates, including single domain, multi-domain, extended validation single domain, extended validation multi-domain, UCC Exchange and Wildcard certificates.
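The binding of domain, host and organization details into a certificate can be seen by pulling a live certificate with Python's standard ssl module; the hostname below is just a placeholder, so substitute any HTTPS site.

```python
import socket
import ssl

hostname = "www.example.com"   # placeholder; substitute any HTTPS site
context = ssl.create_default_context()

# Open a TLS connection and read the server's validated certificate.
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()

print(cert["subject"])             # commonName, organizationName, locality, ...
print(cert.get("subjectAltName"))  # the additional host names the certificate covers
print(cert["notAfter"])            # expiry date
```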

Purpose of SSL Certificate

This certificate is essential for online businesses and organizations. When running an online business, your...