
True Data Loss Prevention

Created: 29 May 2009 • Updated: 03 Jun 2009 | 4 comments
Adrian Diaz

To all reading: not all data loss prevention solutions are equal.  This company had implemented the Vericept product before my time; the intent was for our Corporate Security arm to monitor for malicious and fraudulent activity, while Information Security was using it to try to keep information from leaving via the internet.  When I started to look at the product, I was not at all happy with what I found.  Creating appropriate policies was cumbersome and easily botched, the filtering and querying of information was close to useless, and the database management was a mess.  So I started to evaluate new products.  One solution was great at capturing information, but its interface to structured data was iffy and its reporting was horrible.  Another solution was also good, but its operations and support were mostly a one-man shop.  Then I came to the Symantec DLP/Vontu solution.  Discussing and evaluating the product was simple, since we have a great relationship with Symantec and the support structure is amazing.

I noticed right away that this solution was built from the ground up with the true goal of preventing data loss.  No adding pieces over time and then trying to mold something else into a best-effort DLP attempt.  The interface is extremely user friendly, the filtering options are endless, the incident workflow is actually useful, and the reporting is excellent.  The main selling point was that our non-technical Corporate Security team loved it and understood exactly how to use it.  In reality, the previous system had just been collecting dust.  Another point that made the transition a no-brainer is how easy Symantec made it to convert solutions.  The price point was lower than before, training and services were included, and the hardware requirements were compatible.

So all we had to do was wipe the servers clean, install the new software, configure, and go.  The installation process went extremely smoothly and was done in less than a day.  Both teams met for a day, and we were easily able to recreate all of our previous policies along with using the out-of-the-box policy templates.  For us, the templates captured exactly what we were looking for.  We were up and running, viewing and reporting on incidents, in less than 3 days total.

Now the challenges and tips:
1.  Remember to involve IT teams in the decisions and implementation.  We had to procure another server in order to have high availability and redundancy, and this was an issue with IT.
2.  To turn on prevent for email, you have to go inline with the prevent portion, and all mail has to route through these systems.  The network team had many concerns about email performance, but everything worked great through our load balancers.
3.  Spend more time on training.  I had online web training, and this never works out well: too many interruptions and loss of focus.
4.  EDM indexing requires a custom script to run, so make sure to obtain services to assist with this.
5.  Spend a good amount of time learning how to filter incidents and create custom reports and views; this is important to really understand the incidents.
6.  The system has a good workflow process for managing incidents.  However, you have to define how the flow is going to occur.  Make sure to train on this well or it won't be used.
7.  Prep management in the beginning on the difference between monitoring incidents and preventing/blocking incidents.  If you go at it alone and don't, you will be stuck on monitoring forever and won't get buy-in for flipping to prevent mode.  Prevent mode is where users will be inconvenienced and you will take heat.  You need management buy-in and approval!
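On tip 4: the EDM (Exact Data Matching) indexer expects a clean, consistent data source, which is usually what the custom script has to produce.  Below is a minimal sketch of that kind of pre-indexing cleanup, assuming a pipe-delimited export; the file layout and column handling here are my own assumptions for illustration, not the actual Vontu/Symantec indexer format, so still get services to help with the real thing.

```python
# Hypothetical pre-indexing cleanup for an EDM data source (see tip 4).
# The pipe-delimited layout is an assumption for illustration only, not
# the actual Vontu/Symantec DLP indexer input format.
import csv

def prepare_edm_source(in_path, out_path, delimiter="|"):
    """Trim whitespace, drop rows with empty cells, and dedupe rows.

    Incomplete or duplicated records tend to produce noisy EDM
    matches, so they are filtered out before indexing."""
    seen = set()
    kept = 0
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter=delimiter)
        writer = csv.writer(dst, delimiter=delimiter)
        for row in reader:
            cleaned = tuple(cell.strip() for cell in row)
            if any(cell == "" for cell in cleaned):
                continue  # skip incomplete records
            if cleaned in seen:
                continue  # skip exact duplicates
            seen.add(cleaned)
            writer.writerow(cleaned)
            kept += 1
    return kept
```

The exact rules (which columns are required, how names are normalized) depend on your data source, which is another reason tip 4 recommends bringing in services.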

Other than that, this system is incredible at catching data loss, the best out of any competitor.  You will be able to sleep at night.  Hope this helps everyone!

Comments (4)

demaurojoe:

"The key to DLP is the ability to PREVENT data loss from the Network. As such, one has to focus on the data's destination. The data may be in Motion to the Internet, to Removable Media (through copy, save, save as, etc.), or to a Printer. DLP vendors call these Network, Endpoint and Printing. Some vendors also offer a Discovery piece, which is like a search engine for sensitive data anywhere on the Network.
Prevention requires that some outbound transmissions (outbound to the Internet or to Removable Media) be BLOCKED to prevent data loss. Monitoring outbound transmissions only means that you get a report on what security breaches have occurred. This is what is provided by most DLP vendors, because they originally built systems to monitor key-words in emails. Subsequently, they all had to improve their detection engines and support other protocols. Nevertheless, the essence of their technology traces back to providing Content Inspection, not Data Loss Prevention. So for the most part, they are still in the DLD business. They either cannot Block transmissions in real-time, or they only support a few outbound channels on a few ports. Some banks seem to be happy with monitoring just Webmail and SMTP. How come they're not worried about HTTP Server traffic, or HTTP Tunnel traffic? Such traffic consists of the outbound streams generated by a Web server after an Internet user's request. How many times do we hear about a breach originating from the Web Server?

The greatest secret of the industry lies in the accuracy of the detection engines. I submit to you that if a vendor has any degree of False Positives in detecting data, then you will never enforce Blocking Policies. You will only Monitor transmissions. In that case, you would be buying a Data Loss Detection system, and, like some bank CIOs, you would have to be satisfied merely with reports on what security breaches have occurred.

Many analysts have been following what vendors are defining as DLP. Some of them simply summarize vendors' marketing materials. None of them are talking about accuracy of detection, which is paramount to any DLP system. After all, it is only recently that Gartner changed the name of the segment from Outbound Content Monitoring and Filtering to DLP. When you are focused on Monitoring or Filtering, you are typically not concerned with breadth of Protocol support or with detection accuracy.

The market is still in its nascent stage. Most companies are looking to protect Personally Identifiable Information such as CCN, SSN, telephone, email, etc., mainly for compliance reasons. The situation is currently being aggravated by regulators. We now hear that Nevada and Connecticut have introduced regulations which require companies to Encrypt PII. Encryption is a system which protects the "Hacker": if users are able to encrypt emails, then Administrators will never be able to find out what was sent in them. The same can be said for encrypting files. Was this not in essence the case in the Heartland breach, where data left encrypted and 100 million credit card numbers were lost? The question that DLP answers is not whether the data left securely, but rather whether the data should be allowed to leave the Network to begin with. Therefore, it makes sense for the DLP system to enforce encryption of data that requires it and/or Block transmissions of high severity levels. In this way, administrators will be able to trace whatever data leaves the Network, even though encrypted."

Referenced from: www.gtbtechnologies.com. I'm having them send me an eval.
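The comment's point about detection accuracy is easy to demonstrate with credit card numbers: a bare 16-digit pattern flags any long number (invoice IDs, tracking numbers), while adding a Luhn checksum validation pass rejects most random digit strings. This is a generic, widely known technique sketched here for illustration, not any vendor's actual detection engine.

```python
# Illustrates why raw pattern matching produces false positives and how
# a checksum validation step cuts them down. Generic technique only; no
# vendor's detection engine works exactly like this.
import re

# Naive detector: any standalone run of 16 digits.
CCN_PATTERN = re.compile(r"\b\d{16}\b")

def luhn_valid(number: str) -> bool:
    """Standard Luhn check used to validate credit card numbers."""
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def find_ccns(text: str, validate: bool = True):
    """Return candidate card numbers, optionally Luhn-filtered."""
    hits = CCN_PATTERN.findall(text)
    return [h for h in hits if luhn_valid(h)] if validate else hits
```

With `validate=False` any 16-digit sequence is reported; with validation on, roughly nine out of ten random digit strings are rejected, which is exactly the difference between a system you only monitor with and one you can trust to block.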

isrlev:

Can anyone tell me the best practice for installing the Detection Servers? After installing the Enforce server, which one should I install first?

Network Monitor Server
Network Prevent Server (Email)
Network Prevent Server (Web)
Network Discover/Protect Server
Endpoint Server
yang_zhang:

All these Detection Servers are for different purposes, and their functions differ as well. So the first thing you need to confirm is: what is your intended usage for the Detection Server?

If you only want to prevent data loss on your endpoint computers, you can choose the Endpoint Server.
If you want to find out where your confidential data is (for example, which table of your database it is located in), you can choose the Network Discover/Protect Server.
If you want to prevent confidential data loss via email or the web, you can choose the Network Prevent Server (Email or Web).

In a word, you need to confirm your requirements before implementing the Detection Servers.
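The requirement-to-server mapping above can be summarized as a simple lookup; the requirement phrasings here are my own paraphrases of this thread, and the server names follow the product tiers listed in the question.

```python
# Memo aid for the comment above: which Detection Server addresses which
# requirement. The requirement keys are paraphrases, not product terms.
DETECTION_SERVER_FOR = {
    "prevent data loss on endpoint computers": "Endpoint Server",
    "discover where confidential data lives": "Network Discover/Protect Server",
    "prevent confidential data leaving via email": "Network Prevent Server (Email)",
    "prevent confidential data leaving via web": "Network Prevent Server (Web)",
    "monitor network traffic for incidents": "Network Monitor Server",
}

def choose_server(requirement: str) -> str:
    """Return the server for a known requirement, else a reminder."""
    return DETECTION_SERVER_FOR.get(requirement, "clarify requirements first")
```

As the comment says, the install order follows from the requirements, so pin those down before deploying anything beyond Enforce.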

UFO:

Training for staff is a very important thing. The majority of employees do not understand why DLP is being introduced. They think that the company doesn't trust them (which is partially true ;)), but they are not aware of, for example, the potential damage that unintentional data loss can cause.

Classifying data is also important, and it should involve all departments. I know companies which have bought a DLP solution but are still struggling with classification, because they did not do well at the prep stage.

STS: DLP
