The past two days are ones I don’t want to repeat any time soon. We had to put our business-critical ERP upgrade project on the back burner due to a malware intrusion that originated in our mail system. The Warlord’s mail was affected; the infection completely shut down his ability to send and retrieve email. He called the C-man at home on Sunday evening and told him to drive over to his house and “fix the problem.” The C-man retrieved the laptop, told the Warlord to get a good night’s rest, and promised a fully operational laptop by the time the Warlord arrived at the office in the morning. Of course, the C-man called to let me know that I could find “the laptop on my desk” upon my arrival that evening.
All systems down
All of this was bad enough, but we had just scratched the surface of the problem. The Warlord called the C-man shortly after he dropped the laptop off at the office. He was not a happy camper; he’d received two phone calls from customer executives indicating they had received “spurious” email from him. Berkeley called me at the same time; Ashby had called him to let him know that she had received a series of “spam” emails from the Patman that matched the description of the emails that had gone out from the Warlord. It appeared we had a serious problem with our Exchange environment, and I put out an all-hands alert to the team to meet me at the office in 30 minutes.
We quickly discovered the Exchange environment had been compromised, and we had to shut down the entire system to avoid further spread of the malware. It took most of the night before we were able to pinpoint the cause of the issue. For remediation, we started with the data center servers, more than 500 of them, including our 14 Exchange servers. More than 50 of them had to be re-imaged; the remainder we simply patched to prevent further spread of the malware.
We then turned to the laptops and desktop systems. The Warlord’s laptop was the first one on the list; Berkeley was completing its reimaging as the Warlord pulled into the parking lot. The rest of the systems took the remainder of the day. A number of end users were unable to use their laptops or desktops until late in the day.
Mail issue started with Patman spam
The cause of the problem was isolated to the Patman. He subscribes to a number of Patriots email lists and had opened an email advertising free Patriots hula dolls that contained malware targeting the Exchange server. When the Warlord, who is a Miami Dolphins fan, heard the genesis of the problem, he was livid and told the Patman that he had until the next morning to unsubscribe from every one of the distribution lists. (The fact that 50 different customer executives had received a Patriots hula doll advertisement from him was frosting on the cake.)
Beyond the orders to Patman, the Warlord also told the C-man that we had until the end of the week to identify a new mail security solution. Our existing solution would be “history” as soon as he “could punt it out the front door.”
Eugene, as you’re looking for a new mail security solution, check out Symantec Brightmail Gateway. The solution provides security for both incoming and outgoing mail. In addition, you can add Symantec Brightmail Traffic Shaper to your environment, which will filter out up to 70 percent of mail volume and 80 percent of spam volume before it hits your network. That helps you sustain performance in your Microsoft Exchange environment without adding more software and hardware to accommodate the email volume growth driven by burgeoning spam rates. Also, you might want to check out the latest State of Spam report from Symantec and subscribe to the monthly RSS feeds.
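To see what those filtering rates mean in practice, here’s a back-of-the-envelope sketch. The 80 percent spam-drop figure comes from the vendor claim above; the daily inbound volume and the spam share are hypothetical numbers chosen purely for illustration.

```python
# Back-of-the-envelope impact of dropping spam upstream, before it
# reaches the Exchange servers. Only spam_drop_rate comes from the
# vendor claim; daily_messages and spam_share are assumed figures.

daily_messages = 100_000   # assumed total inbound messages per day
spam_share = 0.90          # assumed fraction of inbound mail that is spam
spam_drop_rate = 0.80      # vendor-stated upstream spam drop rate

spam = daily_messages * spam_share
dropped = spam * spam_drop_rate
remaining = daily_messages - dropped
total_reduction = dropped / daily_messages

print(f"Messages reaching Exchange: {remaining:,.0f}")   # 28,000
print(f"Total volume reduction: {total_reduction:.0%}")  # 72%
```

Under these assumed numbers, dropping 80 percent of the spam upstream cuts total mail volume by roughly 72 percent, which lines up with the “up to 70 percent of mail volume” claim when most inbound mail is spam.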