
Windows fails after image successfully completed

Created: 16 Jul 2012 | 5 comments

...I am imaging down an imported Ghost image to several machines.  The task completes perfectly, but when the machines reboot I get the following error message:

Windows failed to start. A recent hardware or software change might be the cause:

I have tried multiple images coming down to multiple machines, and they all get the exact same error (when Windows first reboots; it doesn't even go into sysprep first).

These images were working fine last week, but this week they are doing this.  When I just use normal Ghost (same images) the machines come up flawlessly.

Any clue would be greatly appreciated


Comments (5)

yogeshsadhu:

You should perform hardware-independent imaging when you deploy an image to a different set of hardware. The machine will fail to boot if it finds new devices for which device drivers are missing.

You have not mentioned which Deployment Solution version you are using; please mention that.

Make sure you have enabled the DeployAnywhere option whenever you deploy an image to a machine different from the base machine the image was taken from.

Also make sure you have added the required drivers to the driver database. For example: if you took the image from an IBM machine and want to deploy it to Dell and HP machines, then you need Dell and HP drivers in your driver database. You will find the drivers on the driver CD that comes with the hardware, or you can download them from the vendors' websites. You can add multiple drivers to the driver manager.
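The driver-database idea above boils down to a model-to-driver-folder lookup. A minimal sketch of that idea in Python (the model names and UNC paths here are invented for illustration, not real Deployment Solution paths):

```python
# Minimal sketch of a model -> driver-folder lookup, the idea behind a
# driver database. Model names and paths are illustrative placeholders.
DRIVER_DB = {
    "OptiPlex 760": r"\\ds-server\drivers\dell\optiplex760",
    "HP Compaq 6000": r"\\ds-server\drivers\hp\compaq6000",
}

def drivers_for(model: str) -> str:
    """Return the staged driver folder for a hardware model."""
    try:
        return DRIVER_DB[model]
    except KeyError:
        # This is exactly the failure mode described above: a target model
        # with no drivers staged will not boot after imaging.
        raise LookupError(f"No drivers staged for model: {model!r}")

print(drivers_for("OptiPlex 760"))
```

The point is only that every target model must resolve to a driver set before deployment; DeployAnywhere automates this lookup.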

Let me know if you need any more information.

Yogesh Sadhu.

If you feel your issue has been addressed, please use the "Mark as Solution" link for the relevant thread.

jtjohn1:

Using DS 7.1 (Ghost imaging; actual version is 7.1.8280).

Using our own unattend.xml file

Images are coming FROM a Dell OptiPlex 760 and going back ON a Dell OptiPlex 760 (though they are different 760s).

Images have been sysprepped before being imaged up (using Ghost to image them up, then importing them into the DS).

Last week the images were working fine (tested a couple of different machines using a couple of different images).

This week none of the images seem to work

I just tried DeployAnywhere and it fails at the exact same spot (it doesn't go into sysprep at all; it just boots up and fails with the error message).

The only real difference between last week and this week is that I licensed it (we already had an Altiris license at the college I work at but weren't using the DS.  I downloaded the demo, played around with it, and got it working well enough that we want to move it into a real test environment).

Thanks for any help you can provide!!

james_g:

Why, in this day and age, would you be using a sysprep'd image, and why would you be using DeployAnywhere, or Ghost? All ancient, archaic, legacy solutions. Modern-day deployment has migrated to a much more advanced platform and architecture, where images are deployed in place from original source content, with hardware-specific drivers extracted from published packages that contain more marketing content than anything else!

And to add to the comments above, why would you implement an imaging solution that is choked by network capacity to a central server, when you could implement a scalable imaging solution that depends only on the number of pieces of hardware you want to image and the physical connections of the hardware itself to your network?

In our real-world scenario the capacity to image 5, 50, 500, or 5000 workstations is realistic, while each imaging event is self-contained and depends only on commands from the DS server.

my two cents worth ...

jtjohn1:

Well instead of telling me about how archaic my solution is why don't you tell me what, exactly, your much more modern solution is??

We have tried several different deployment solutions such as Kace and Acronis.  They work FINE for a plain vanilla image or deployment.

Start adding in MASSIVE full software packages such as the full version of Adobe Master Suite plus the full version of Autodesk/AutoCAD (in most of its various packages, plus 3ds Max, plus virtual machines on the clients' computers) and they start to break down.  The same image that takes 4+ hours to image down on Kace takes 48 minutes to image down on DS 7.1 using Ghost.

One of our lab images that goes onto about 200 machines is over 120 GB.  Installing the software MANUALLY from a flash drive takes quite a while.  Installing it from the network takes many hours.  Having it as part of an image takes less than an hour, and we know the machines are all EXACTLY the same before we freeze them.
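For a rough sense of scale, pairing the figures quoted in this thread (a ~120 GB image, deployed in about 48 minutes) implies the following sustained throughput. Back-of-envelope only; the two numbers come from different sentences above, so treat this as an estimate:

```python
# Back-of-envelope throughput implied by the figures in this thread:
# a ~120 GB lab image deployed in ~48 minutes via DS 7.1 + Ghost.
image_gib = 120          # image size, treated as GiB for simplicity
minutes = 48             # quoted deployment time
mib_per_s = image_gib * 1024 / (minutes * 60)  # MiB per second
print(f"{mib_per_s:.0f} MiB/s")  # about 43 MiB/s sustained
```

That rate is well past what a single saturated gigabit link plus disk I/O can casually deliver to hundreds of clients, which is why multicast imaging of big monolithic images can beat per-package network installs here.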

If you have a suggestion for a more modern and robust platform, by all means tell us about it (I am ALL for looking at something else that works).

Otherwise you're just wasting my time.

Just my .02

BBC:

In very brief, the process we use looks like this:

1. Use PXE to deploy a local WinPE which boots into RAM;

2. Reboot to the local WinPE;

3. Tokenize the files required for the client to finish off its configuration;

4. Prepare the HDD;

5. Copy over the OS source (plain source content for an unattended install);

6. Copy over the core applications that will be installed on all clients;

7. Identify the hardware and copy over the respective driver set for that hardware **

8. Run the unattended installation of the OS. Once the initial part is done, do NOT automatically reboot the client; instead, the last script we execute thru the console ends with an EXIT command, which forces the WinPE session to quit and the client to reboot;

9. Now, with nothing active on the console, the OS install finishes and the core applications are installed ***

10. The next task sitting on the console and waiting to execute is in "Production" environment and will finish off some tasks like cleaning up no longer needed contents etc.

** I use a locally stored VBS and call it thru a DS task; from within the VBS I run WMI commands to collect the product name (i.e. ThinkPad blablabla, HP Compaq blablabla, etc.). This allows me to update the drivers where I need to and avoid conflicts with later or older versions that might be required. In addition, when I look thru the forums here and see how many issues DeployAnywhere causes, I'm just happy NOT to use that feature!
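The product-name lookup described above could be sketched like this. Note this is a Python illustration of the same idea, not the poster's VBS; the sample output string mimics what `wmic csproduct get name` prints, and the model name is invented:

```python
# Sketch of the WMI product-name idea from the note above. The real script
# is VBS running WMI queries; here we just parse the kind of two-line text
# that `wmic csproduct get name` prints. Sample output is made up.
SAMPLE_WMIC_OUTPUT = "Name\nOptiPlex 760\n"

def product_name(wmic_output: str) -> str:
    """Extract the product name from `wmic csproduct get name` output."""
    lines = [ln.strip() for ln in wmic_output.splitlines() if ln.strip()]
    return lines[1]  # first non-empty line is the 'Name' column header

print(product_name(SAMPLE_WMIC_OUTPUT))  # OptiPlex 760
```

The returned name then selects which driver set gets copied to the client, replacing what DeployAnywhere would otherwise do automatically.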

*** For core application installations, either a simple CMD file, a VBS, or whatever script could be used, or, as we do, a graphical interface for those who love to watch clients install. Each operation during the application installation is logged, so we can always check the respective LOG files if anything has gone wrong.
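The per-application logging mentioned above might look like this in outline. Illustrative only: the thread's scripts are CMD/VBS, the application names are invented, and the actual installer invocation is stubbed out:

```python
import logging

# Illustrative per-application install logging, mirroring the note above.
# Application names are invented; a real script would run silent installers
# and record their exit codes instead of the placeholder below.
logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("core-apps")
log.setLevel(logging.INFO)

def install(app: str) -> bool:
    log.info("installing %s", app)
    ok = True  # placeholder: would be (installer exit code == 0)
    log.info("finished %s: %s", app, "OK" if ok else "FAILED")
    return ok

results = {app: install(app) for app in ["7-Zip", "Office", "AV client"]}
```

A log line per start/finish pair is what makes the "check the LOG files if anything has gone wrong" step above possible.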

If any task fails on the console against the client, I always use an e-mail notification, which alerts me with the details I want for such an unlikely event.
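A failure alert like the one described could be assembled along these lines with Python's stdlib. The addresses and task/client names are placeholders, and the actual send (e.g. via `smtplib.SMTP`) is deliberately left out of this sketch:

```python
from email.message import EmailMessage

# Sketch of a task-failure alert mail, per the note above. Addresses are
# placeholders; sending via smtplib is omitted in this illustration.
def failure_mail(task: str, client: str, detail: str) -> EmailMessage:
    msg = EmailMessage()
    msg["Subject"] = f"DS task failed: {task} on {client}"
    msg["From"] = "ds-alerts@example.edu"
    msg["To"] = "admin@example.edu"
    msg.set_content(detail)
    return msg

alert = failure_mail("Deploy image", "LAB-PC-042", "Exit code 1 in step 8")
print(alert["Subject"])
```

Wiring this to a "task failed" return-code condition in the console gives the hands-off alerting described above.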

What I have also done is create a migration process from XP 32-bit to Windows 7 64-bit, which is about equal to our imaging process and still follows best practices from Microsoft.

Instead of updating one or more sysprep images on a regular basis, or having WSUS or other architectures shoot out big amounts of patches and application updates post-imaging, I can always incorporate the latest content right into the process. I do not have to replicate GBs of data out to every site (a real Enterprise environment); I only replicate these smaller contents.

If you question how long this build process takes: around 3 hours a-z, which to me is an acceptable time frame considering I can also use this very same process from a pen drive, with much lower maintenance effort.


PS: We have a fundamental principle, "K.I.S. = Keep It Simple", and try to stick to it wherever we can. We have been doing much better than many others out there, as that works out best - at least for OUR real Ent. environment.