In Defense of Hybrid Images

I received an e-mail today from a reader asking why I use two deployment shares instead of just one. In his defense, I think he was confused, so I’ll try again to stress that MDT can build images for us, and that this is preferred for many reasons, not the least of which is saving time. If I can automate it, why do it by hand? Anyway, here’s a rant laced with excerpts from the exchange.

I use two shares: one to build images and another to deploy them. The first I name BuildShare and the second I name ProductionShare. In my image I put Office, the .NET Framework, Visual C++, and Silverlight. I don’t put much else in there, but I let MDT patch those, and then there are some reg hacks I run for branding and defaults I want to bake into the image. The build itself is just a standard client task sequence; because it runs from a separate share that’s configured to capture at the end of a task sequence, the image gets captured for me automatically.

[Screenshot: The Build Share’s CustomSettings.ini File – Tweaked to Capture Images]

This is the reason I use a separate share: it lets me configure the CustomSettings.ini file to automate a large part of the capture process. I simply set DoCapture=YES, and by running a standard client task sequence, MDT captures the image for me! I can then take that fully patched image and send it off to the production share. The production share is what installs Flash, Java, Adobe Reader, and all that crap that’s always changing or varies by department. It’s easier to just update the MSI in the share once and be done with it.
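For reference, here’s a minimal sketch of what the build share’s CustomSettings.ini might look like. The server name (MDT01), share name (BuildShare$), WSUS URL, time zone, and password are placeholders for whatever your environment uses; Johan’s guide linked at the bottom has the full, battle-tested version.

[Settings]
Priority=Default

[Default]
; The setting that matters: capture the image when the task sequence finishes
DoCapture=YES
ComputerBackupLocation=NETWORK
BackupShare=\\MDT01\BuildShare$
BackupDir=Captures
BackupFile=%TaskSequenceID%.wim

; Keep the reference machine off the domain and let it patch from WSUS
JoinWorkgroup=WORKGROUP
WSUSServer=http://MDT01:8530
FinishAction=SHUTDOWN
HideShell=YES
DoNotCreateExtraPartition=YES
AdminPassword=SomePassword
TimeZoneName=Eastern Standard Time

; Skip every wizard pane except the task sequence picker
SkipAdminPassword=YES
SkipProductKey=YES
SkipComputerName=YES
SkipDomainMembership=YES
SkipUserData=YES
SkipLocaleSelection=YES
SkipTimeZone=YES
SkipApplications=YES
SkipBitLocker=YES
SkipSummary=YES
SkipFinalSummary=YES
SkipCapture=YES
SkipTaskSequence=NO

The only prompt left is which task sequence to run, which is exactly the workflow described below.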

This task sequence is fully automated; the hardest thing I have to do is boot a VM, select which TS I want to run, and go get some tacos. MDT builds and captures the image for me.

Why even image at all? Why not just install everything by hand? We use images to save time. It’s all about that final deploy time. Often when you need to re-image a broken computer, that PC needed to be ready hours ago, but it’s not; it’s dead because some user clicked god knows what in an e-mail. Deployment is about more than bare-metal installs; it’s about being able to use operating system deployments for break/fix. The better patched the reference image is, the faster we can get users up and running and back to work. Few things get clients more excited than telling them that not only can MDT image their PCs, it can build those images for us as well. My build lab builds me a new “shiny” image every month and our deploy times stay under half an hour. The older I let a reference image get, the more patching it needs before I can let a user sign in.

[Screenshot: The BuildShare builds me a variety of images – Win7, Win10, and Server – in just a few clicks.]

Having a build share lets me automate the image build process and save hours every time I run it. It also means I can reliably repeat the EXACT same capture six weeks or six months later, when I need a new image with all the latest patches, and keep my deployment times low.

The time to install Office may seem minimal now, but the time to patch Office, the .NET Framework, Visual C++, and the core OS grows and grows as the image gets older. Simply put, given enough time that patch load balloons from a dozen updates into hundreds, and your ability to patch a thin image on the fly becomes less and less reliable. I’ve seen a 14-month-old image go from 28 minutes to deploy to almost three hours thanks to Office and .NET service packs that had to be downloaded and applied. It’s just more reliable to have a TS that can cook a new image for me, one I can re-run every so often to keep those times down, because I can promise you the one-hour difference you see today between a thin and a hybrid image becomes more like three hours a year from now.

I’m not suggesting you put Java, Flash, and Adobe Reader in the image. Just put Office, the Visual C++ runtimes, and the .NET Framework in there, because MDT can at the very least patch those before running Sysprep and capture for me. Think of this ‘hybrid image’ as a software installation platform that your core line-of-business applications will sit on.

With all that being said, it’s important to note that there are very specific things that should only be done in your production share’s task sequence, such as domain join, Windows activation, BitLocker, and antivirus installs.
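A hedged sketch of how those pieces land in the production share’s CustomSettings.ini (the domain, join account, OU, and application GUID below are placeholders, not my real values):

[Settings]
Priority=Default

[Default]
; Domain join happens here, never in the build share
JoinDomain=corp.example.com
DomainAdmin=MDT_JD
DomainAdminDomain=corp.example.com
DomainAdminPassword=SomePassword
MachineObjectOU=OU=Workstations,DC=corp,DC=example,DC=com
SkipDomainMembership=YES

; BitLocker: TPM protector, recovery key escrowed to Active Directory
BDEInstall=TPM
BDERecoveryKey=AD
SkipBitLocker=YES

; Antivirus pushed as a mandatory application (GUID comes from the share's Applications node)
MandatoryApplications001={your-av-application-guid}

; No capture on the production side
DoCapture=NO
SkipCapture=YES

Activation I leave to KMS, so there’s no key in the file; if you use MAK keys, the ProductKey property is where one would go.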

For a complete guide on how this is done, see Johan’s guides. I follow them religiously for one reason: deviation from his teachings all but guarantees pain. Compliance ensures happy deployments. ’nuff said.

SEE: http://deploymentresearch.com/Research/Post/496/Building-a-Windows-10-Reference-Image-using-MDT-2013-Update-1
