Is what I am planning to do feasible? Please advise.
I work at a large college and have been given the daunting task of consolidating our 37+ images for all the different hardware we have in house into a handful of images. To get the job done I plan to use Sysprep to make a universal base image containing the drivers for all of our current hardware. Once I have verified the base image works on all the hardware we have in house, these are the next steps I plan to take.
1. Make a base universal image with all the apps I know go on every computer in the building. I plan to keep this image as "thin" as possible. After that is done, I plan to script the installation of the unique applications needed after the base universal image has been laid down.
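To make my plan concrete, here is a rough sketch of the kind of post-image install script I have in mind. The share path, installer names, and silent-install switches are all placeholders -- the real silent flags would come from each vendor's documentation.

```python
import subprocess

# Hypothetical network share holding the per-department packages
PACKAGE_SHARE = r"\\deployserver\packages"

# (installer file, extra silent-install arguments) -- examples only
UNIQUE_APPS = [
    ("dept_app_setup.exe", ["/quiet", "/norestart"]),
    ("library_client.msi", []),  # .msi files go through msiexec below
]

def build_command(installer: str, args: list) -> list:
    """Build the silent-install command line for one package."""
    path = f"{PACKAGE_SHARE}\\{installer}"
    if installer.lower().endswith(".msi"):
        # /qn = fully silent MSI install
        return ["msiexec", "/i", path, "/qn"] + args
    return [path] + args

def install_all() -> None:
    """Run each unique app's installer and report its exit code."""
    for installer, args in UNIQUE_APPS:
        code = subprocess.call(build_command(installer, args))
        print(f"{installer}: exit code {code}")

if __name__ == "__main__":
    install_all()
```

The idea is that the thin image stays identical everywhere, and only this per-machine (or per-lab) list of unique apps changes.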
I have a few questions and concerns that I am hoping someone here can answer.
1. To script the application installs after the thin universal image has been laid down, what kind of server would I need? Will it take an exorbitant amount of bandwidth to pull off what I am planning? Should I be looking into setting up a packaging server, and what kind of hardware would it need? One of my major concerns: if 45 machines connect to a packaging server simultaneously to install a particular package, will the generated traffic be unicast or multicast? How much bandwidth would I need?
2. I am planning to build all my reference images using virtual machines. Is this a good idea? Is there a downside?
Thanks for any help or advice you can pass my way before I begin this daunting project.