
Scheduling LiveUpdate content (definitions) without installing LiveUpdate Administrator Server?

Created: 07 May 2014 • Updated: 08 May 2014 | 7 comments

I work in an environment where about 450 servers and about 200 of our workstations are virtualized and share a SAN for storage. We have 1,500 total clients. We run SEPM 12.1.4, and had not noticed this issue until after we updated from 12.1.2. We are not running vShield yet, but may move to it in the future.

I am having a lot of I/O contention on the SAN because all of these systems want to update definitions at the same time.

I worked with support, and they recommended we install and set up a LiveUpdate Administrator Server so that I can use the scheduling feature in the LiveUpdate policy.

That seems like a lot of extra work when the SEPM server is doing fine hosting the definition files...

There is a lot of best-practice and other documentation regarding virtual environments, and some of it contradicts itself, so I am hoping someone has real-life experience with what they did in similar situations...

I see the following as solutions without building out another server (some documentation says it should not be a VM, while other documentation says it can be):

1. I expanded "Disk Space Management for Downloads" to 15 revisions. Some time ago it was set to 2 because of storage space, but prior to that it had been set to 6 since the version 11 install. Somewhere I read that lowering it was safe, but I believe we were having an issue where clients downloaded the full definition file every time they updated, regardless of how many revisions the SEPM maintained (that was some time ago). If clients were downloading only the small delta update, I can see where the write load on the SAN would be much lower; but if I recall correctly, it was not working that way.
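Back-of-the-envelope arithmetic shows why the delta-vs-full distinction matters so much for SAN write volume. The sizes below are illustrative assumptions, not measured values:

```python
# Rough comparison of SAN write volume when every client pulls a full
# definition set vs. a small delta. FULL_DEFS_MB and DELTA_MB are
# assumed, illustrative sizes; only CLIENTS comes from the post.
CLIENTS = 1500          # total clients in the environment
FULL_DEFS_MB = 300      # assumed size of a full definition package
DELTA_MB = 2            # assumed size of an incremental (delta) update

full_write_gb = CLIENTS * FULL_DEFS_MB / 1024
delta_write_gb = CLIENTS * DELTA_MB / 1024

print(f"Full downloads:  ~{full_write_gb:.0f} GB written per update cycle")
print(f"Delta downloads: ~{delta_write_gb:.1f} GB written per update cycle")
```

Even with generous assumptions for the delta size, the write volume differs by two orders of magnitude, which is why confirming clients actually take deltas is worth the effort.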

2. Expanding the download randomizer window: I see it can be set up to 48 hours, and I could see where raising it toward the maximum might reduce the number of systems downloading updates at the same time. *The tech who supported me stressed that "5 minutes is best practice," though the best-practices document for virtualized environments says 2 hours or more...
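As a rough illustration of why a wider window lowers concurrency, here is a small simulation. It is a sketch only: it assumes each client picks a uniformly random start offset within the window and that a delta download takes about 2 minutes, which approximates how a download randomizer spreads load:

```python
import random

def peak_concurrency(clients, window_min, download_min=2, seed=0):
    """Worst-case number of clients downloading at once, assuming each
    client starts at a uniformly random offset within the window and
    every download takes download_min minutes."""
    rng = random.Random(seed)
    starts = sorted(rng.uniform(0, window_min) for _ in range(clients))
    peak = 0
    for i, s in enumerate(starts):
        # all clients that start within [s, s + download_min) overlap
        j = i
        while j < len(starts) and starts[j] < s + download_min:
            j += 1
        peak = max(peak, j - i)
    return peak

for window in (5, 120, 48 * 60):   # 5-minute, 2-hour, 48-hour windows
    print(f"{window:>5}-min window -> peak ~{peak_concurrency(1500, window)} concurrent downloads")
```

With 1,500 clients, a 5-minute window leaves most of them downloading simultaneously, while a 2-hour window already cuts the peak dramatically; the jump from 2 hours to 48 hours buys comparatively little.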

3. Group Update Provider: I was thinking that limiting the number of simultaneous connections would reduce the number of systems downloading updates at once. I was considering using the SEPM as the "GUP" and pointing the servers at it, thereby reducing the number of systems writing to the SAN at one time...
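Capping simultaneous connections is essentially a counting-semaphore pattern. This sketch (hypothetical names and numbers; not SEP's actual implementation) shows how a fixed number of download slots bounds concurrency no matter how many clients arrive:

```python
import threading
import time

MAX_SIMULTANEOUS = 10                       # assumed connection cap
slots = threading.BoundedSemaphore(MAX_SIMULTANEOUS)
lock = threading.Lock()
active = 0
peak = 0

def download_definitions(client_id):
    """Simulated client: waits for a free slot, then 'downloads'."""
    global active, peak
    with slots:                             # blocks until a slot frees up
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.01)                    # stand-in for the transfer
        with lock:
            active -= 1

threads = [threading.Thread(target=download_definitions, args=(i,))
           for i in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"peak concurrent downloads: {peak}")
```

However many clients queue up, the semaphore guarantees the peak never exceeds the cap; the trade-off is that total update time for the group stretches out instead.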

Any input on this would be great.

Thanks,


7 Comments

.Brian:

With that many clients, 15 revisions should be fine. It's roughly 5 days' worth of updates.
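The "5 days" figure follows from simple arithmetic, assuming Symantec publishes roughly three definition revisions per day (the rate assumed here; it also matches the 20-revision / 6-7-day figure later in the thread):

```python
REVISIONS_KEPT = 15
REVISIONS_PER_DAY = 3   # assumed typical publication rate

days_of_coverage = REVISIONS_KEPT / REVISIONS_PER_DAY
print(f"{REVISIONS_KEPT} revisions ~ {days_of_coverage:.0f} days of coverage")
```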

The key here will be the GUP setup. Putting a GUP at each LAN will reduce traffic instead of coming back over the WAN for updates.

48 hours is high for the randomizer; 2 hours is much better. Even 4 would be OK.

Please click the "Mark as solution" link at bottom left on the post that best answers your question. This will benefit admins looking for a solution to the same problem.

Colin Barr:

With the widespread documentation on how to set what in Symantec, along with no real explanation of how some settings affect other settings, and no "big picture" of what needs to be coordinated in scheduling, I have set the following for our environment.

I am slowly phasing these settings into our virtual environment, and will follow up with a "final results" post on Monday once I fully verify that it is working as desired. If something below is not correct in my understanding, please let me know. This is patched together as a "best process" from many documents / posts / forums...

1. SEPM downloads current definition files every 6 hours

2. SEPM maintains 20 revisions (6-7 days)

3. All servers / workstations contact the SEPM with a heartbeat of 1 hour. At that time they may update policy if it has changed, but they wait to download definitions until their scheduled randomization window

4. If definitions need to be updated, clients download them based on the randomization window:

  • Servers (mainly VM) - 4 hours
  • VM workstations - 4 hours
  • Physical workstations (local site to SEPM) - 1 hour
  • GUPs - 10 minutes (so they can provide updates to their remote site before that site tries to download them)
  • Remote-site systems - 2 hours; this gives the GUPs more time to push updates to their systems

I believe this will keep the revision updates to clients much smaller and more consistent. It also means that if a system checks in to the SEPM 2 minutes before the SEPM gets new definitions, it will still pick up the new update within the 1-hour window before the SEPM updates again, alleviating the need for duplicate packages.
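A quick way to sanity-check a staggered plan like the one above is to estimate the average number of clients downloading at once in each group. The server and VM-workstation counts come from the original post; the remaining counts and the 2-minute transfer time are assumptions chosen so the groups sum to the stated 1,500 clients:

```python
TRANSFER_MIN = 2   # assumed minutes per delta download

# group: (client_count, randomization_window_minutes)
groups = {
    "Servers (mainly VM)":   (450, 4 * 60),
    "VM workstations":       (200, 4 * 60),
    "Physical workstations": (600, 60),       # assumed count
    "GUPs":                  (10, 10),        # assumed count
    "Remote-site systems":   (240, 2 * 60),   # assumed count
}

# average concurrent downloads = clients * transfer time / window length
results = {name: count * TRANSFER_MIN / window
           for name, (count, window) in groups.items()}

for name, avg in results.items():
    print(f"{name:<22} ~{avg:.1f} concurrent downloads on average")
```

If any group's average lands uncomfortably high, that group's window (or its client count per GUP) is the knob to turn.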

 

Questions:

The GUPs - do they only download the delta revision updates, or the whole package, just as the SEPM does?

Does the GUP's OS matter for providing updates to the systems set to pull from it? For example, I have a Server 2008 R2 x64 set as a GUP for a site that has XP x86 and Win7 x64 clients.

Thanks again.

.Brian:

The GUP will download the full package in addition to the deltas.

The OS does not matter. Any machine with a SEP client can become a GUP and the GUP will serve content to any other SEP client, regardless of OS.


Tyler A:

We are also seeing high SAN I/O utilization when Symantec updates its virus definitions.  Is there a way to share a repository for definitions across multiple servers?  What are the best ways to reduce I/O utilization in a large virtualized environment?

Colin, please update with results when available.

 

Thank you.

HighTower:

Do you have "Run an Active Scan When New Definitions Arrive" enabled?  That's on by default.

Virus and Spyware Protection Policy > Windows Settings > Administrator-Defined Scans > Advanced > uncheck "Run an Active Scan When New Definitions Arrive"

 

HighTower:

A shared repository is something they've talked about, but it is not yet available. I agree that it would make a ton of sense, especially if you're using the virtual appliances.