Part 4: Choosing an Image Strategy and Building Your Image

    Posted: 08 Nov 2009
    Now that we’ve talked about deployment all-up, data migration, and application compatibility, let’s focus on imaging. This is not the imaging that involves photos and cameras, but the imaging of computer disks.



    Quick History Lesson for System Imaging

    Imaging tools have been around for a long time; the most basic of them essentially back up an entire hard drive sector by sector so that the drive can be restored, if desired, to the same or another computer. This is basically a form of drive cloning, and it was popularized in the 1980s and 1990s. Like I said in the first blog, this type of imaging is fairly archaic when used for deployment in the enterprise, since you need to maintain an image per Hardware Abstraction Layer (HAL) type, and for people managing Windows XP you’ll often see an image per language or region as well. What does this mean? For many organizations it means tens or hundreds of images to manage, all requiring maintenance when “Patch Tuesday” or similar update events come around.

    But sector-based imaging can’t be that bad, right? Well, let’s say you have everything centralized, with 20 images to manage and up to 20 computers in your lab. Once that critical update hits, you’ll spend an hour rebuilding each of those machines, maybe an hour configuring them, and then up to three hours recapturing each image with sector-based imaging tools. That means 100 hours per month if you maintain things monthly, or 1,200 hours per year. To be fair, you aren’t clicking and configuring things manually the entire time, but it’s probably fair to say you’ll spend two hours per system across all tasks, and you’ll eventually have terabytes of system images to find space for. If you are using the System Preparation Tool (sysprep) to generalize the image for installation on other computers, then you get only three passes of the tool per system image over its lifetime. That means you generally need to capture each image both before and after running sysprep, so you can start next month from the pre-sysprepped image; otherwise you need to start completely from scratch each time and apply all the changes since the last service pack.

    Fast forward to 2003, when engineers were determining the future of system imaging, and along comes the Windows Imaging Format, or “WIM” file. I was working with Microsoft’s Systems Management Server (SMS) and Solution Accelerator teams at the time, and WIM was a prerequisite for the Operating System Deployment Feature Pack released back in 2004. WIMs are compressed, file-based images that can also save the contents of a drive. WIMs used with Windows XP were a pretty good option from a deployment standpoint, based on the reduced image size and the ability to pass that package over the network, but they were still tied to one HAL type.

    Fast forward to around 2006 and early iterations of Windows Vista… WIMs used for Windows Vista and Windows 7 imaging and deployment take on a whole new meaning. Remember those tens or sometimes hundreds of images to maintain, at up to five hours per month per image? With Windows Vista and Windows 7, you can get down to a single image per operating system architecture (i.e. a 32-bit image and/or a 64-bit image). As an example, right now I am on an airplane writing on a uniprocessor Fujitsu U820 ultra-mobile tablet PC that I built using the same image I’ve applied to my bigger and less airplane-friendly multiprocessor Lenovo T60p 15” laptop, as well as countless other hardware types.

    But it gets even better than just a single image to manage for all hardware (and languages, by the way, too). Remember the five hours or so we would spend building, configuring, and recapturing that old sector-based image? With file-based images of Windows Vista or Windows 7, we can mount them to a file folder and service them offline. In other words, I need only one computer in my lab to use as a reference machine for all computers, I can use a free tool in the Windows Automated Installation Kit called ImageX to capture and apply system images, and I don’t necessarily even need that reference computer to service my one image on Patch Tuesday. I can mount the image in a folder on my image storage server if I want, use the Windows 7 and Windows Server 2008 R2 in-box tool called dism.exe (“Deployment Image Servicing and Management,” in case you’re wondering) to enumerate the contents of the image to see packages, updates, drivers and features, and modify those contents offline, again without building that reference lab computer. The five hours it took to apply the three critical Patch Tuesday patches can shrink to about 2 minutes to mount the image, 10 minutes to service it and 2 minutes to unmount it. I’m usually pretty happy if I can save 4 hours and 45 minutes on an otherwise boring but necessary task, and instead of doing it 20 times on 20 physical machines, I do it once. Makes sense, right?

    To show some of that, here is a video of using sysprep and ImageX to generalize and capture a custom image:

    Video : Preparing an Image using Sysprep and ImageX
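    If you want to try the same thing yourself, the commands look roughly like this; the drive letters, paths and image name below are just examples:

        rem Generalize the reference computer so the image can be applied to other hardware
        c:\windows\system32\sysprep\sysprep.exe /generalize /oobe /shutdown

        rem Boot the reference computer into Windows PE, then capture the system volume with ImageX
        imagex /capture C: D:\images\win7-reference.wim "Windows 7 Reference" /compress maximum /verify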

    And this is a video of dism.exe servicing an offline mounted Windows 7 image:

    Video: Deployment Image Servicing and Management
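    In command form, servicing a mounted image offline looks something like the following sketch; the image path, update and driver file names are only placeholders:

        rem Mount the captured image to a folder for offline servicing
        dism /Mount-Wim /WimFile:D:\images\win7-reference.wim /Index:1 /MountDir:C:\mount

        rem Enumerate what is already inside the image
        dism /Image:C:\mount /Get-Packages
        dism /Image:C:\mount /Get-Drivers
        dism /Image:C:\mount /Get-Features

        rem Add a Patch Tuesday update (.msu or .cab) and an extra driver, no reference machine required
        dism /Image:C:\mount /Add-Package /PackagePath:D:\updates\Windows6.1-KB000000-x86.msu
        dism /Image:C:\mount /Add-Driver /Driver:D:\drivers\nic\netcard.inf

        rem Commit the changes and unmount
        dism /Unmount-Wim /MountDir:C:\mount /Commit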

    I had to take a brief excursion from the deployment task at hand to give the history lesson because, in all my recent interactions with IT pros and my desktop admin friends, I see two common things when it comes to imaging: (1) the majority of people I talk to are still using the sector-based imaging tools they’ve been using for decades, and (2) the majority aren’t maintaining Windows Vista or Windows Server 2008 images, so they aren’t able to do offline image management. Even more troubling are the situations where Windows Vista or Windows Server 2008 are in place, but people are using the 20-year-old tools and processes to manage them and aren’t even using or aware of sysprep, so they still need an image per HAL type, or lots of luck that the non-sysprepped image installs on foreign hardware (this scenario, albeit unsupported, is still somewhat common).



    Building Your Image

    Windows Vista and Windows 7 are delivered as a file-based WIM image with image-based setup. That DVD you might have or the ISO file you downloaded contains a 2+ GB file called install.wim in the Sources directory. The amazing thing about this WIM is that it can actually contain multiple operating system captures. In fact, the Windows Server 2008 R2 Enterprise image contains 8 operating system variants and Windows 7 Ultimate contains 5 variants. You’d think that would make it much larger than the Windows 7 Enterprise install.wim, or a custom captured image with a single operating system, right? Not really. WIMs use single instancing of shared files, so you can have multiple operating systems available in an image that might be about the same size as one captured operating system. This is important as you determine your image strategy, because you can do things like pack multiple operating systems of differing languages into a single WIM file, and even with multiple languages these should be only marginally larger than a single-language WIM image. WIMs can also be used to compress and deliver data, so you can package multiple applications, drivers, and packages into a data WIM, then mount and call them at install time using scripted OS installations.
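    If you’re curious what is inside a given install.wim, either of these commands will enumerate the editions it contains before you decide on a strategy; the drive letter here is just an example:

        rem List the operating system editions stored in a single install.wim
        imagex /info D:\sources\install.wim

        rem Or use the Windows 7 / Windows Server 2008 R2 in-box servicing tool
        dism /Get-WimInfo /WimFile:D:\sources\install.wim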

    Now that you know a bit about WIM files, let’s cover the basics of imaging strategy. There are three primary strategies used for imaging and all are valid depending on the use case:

    1. Thick Image. I like to refer to this as the “old school” approach to imaging: you build a reference machine and install all possible applications to ensure users have every application they could ever possibly need, and usually more. Once that is done, you apply software updates for the OS and all the applications, then you sysprep the computer and capture the image. Then you make sure everything works and that sysprep didn’t affect any applications.

    2. Thin Image. This approach takes things to the other extreme. Little or nothing is installed on the reference computer, and that is sysprepped and captured. Or some will just use the image as shipped on the Windows 7 retail DVD or ISO with zero customization. This strategy assumes you’ll be customizing the installation with applications and other necessary data dynamically at deploy time. It also means all of your applications are packaged for unattended installation, or you are willing to pre-stage them for users to install when they want, or you use something like Application Virtualization (App-V) so application profiles follow users regardless of the device they log into.

    3. Hybrid Image. In between Thick and Thin is a Hybrid Image, where applications that everyone uses or needs are captured in the base image (perhaps your VPN software, your anti-virus software, your version of Microsoft Office and the App-V client). Aside from those core applications, additional applications are layered on at deploy time based on user needs.

    All three of these strategies can be justified, though I personally tend to favor thin images. The thick image approach is useful when a company has a homogeneous environment, uses a single language, and all users need exactly the same set of applications. When using thick images in larger organizations, the trade-offs are that you pay for several applications that may not be necessary for all users, images are larger, multiple applications can affect performance, the image is more difficult to maintain, and flexibility is greatly reduced. Thin images are the most flexible and easiest to maintain, but customizations need to happen at deploy time, which means applications are packaged for silent install and application updates can be installed silently as well. Installation can be slower than with thick images, since each application needs to install itself one by one at deploy time, and more automation is required. Hybrid images include many of the components of thick images without necessarily incurring the licensing costs, the disk space and, often, the performance hit of multiple unused applications.
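    For thin and hybrid images, the deploy-time application installs only work if every package can run without prompts, so the deployment ends up calling commands along these lines; the server share and package name are made up for illustration:

        rem Typical silent install a deployment might run at deploy time
        msiexec /i "\\server\packages\ExampleApp.msi" /qn /norestart
        rem Many non-MSI setup programs expose similar quiet switches, for example:
        rem setup.exe /quiet /norestart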



    Getting to Thin Images

    If you currently use thick images, you might be asking, “Then what tools are there to move to thinner images?” Enter deployment task sequencing. Recognizing the limits of thick images, many people have developed task sequencing engines to not only install applications, but also perform the other common operating system deployment tasks in an automated way. Task sequences are extremely important for computer refresh and computer replacement scenarios, since they:

    1. Validate that the target hardware can install the operating system

    2. Capture user files and settings

    3. Invoke an installation environment like the Windows Preinstallation Environment (Windows PE)

    4. Customize the installation environment

    5. Apply the operating system image

    6. Apply drivers required by the hardware and connected devices

    7. Apply software updates

    8. Apply applications based on your selections

    9. Join the machine to a domain

    10. Re-apply user files and settings

    11. Configure additional things like BitLocker Drive Encryption or server roles

    All of this gets done in a completely automated way using deployment task sequencing – you spend a minute launching it (or schedule it centrally if you’re using System Center Configuration Manager) and the rest just happens without you needing to touch the machine. For someone new to the space it sounds difficult to configure, but this is a standard in-box task sequence from the free Microsoft Deployment Toolkit or the enterprise-class System Center Configuration Manager console.
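    To make steps 2 and 10 above a little more concrete, here is roughly what the User State Migration Tool commands behind “capture” and “re-apply user files and settings” look like when a task sequence runs them; the share path below is a placeholder:

        rem Capture user files and settings from the old installation (USMT ScanState)
        scanstate.exe \\server\migdata\%computername% /i:migapp.xml /i:migdocs.xml /o /c

        rem Re-apply them to the freshly deployed Windows 7 installation (USMT LoadState)
        loadstate.exe \\server\migdata\%computername% /i:migapp.xml /i:migdocs.xml /c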

    Here’s a video of what preparing a build looks like using the Deployment Workbench in the Microsoft Deployment Toolkit 2010:

    Video: Deployment Workbench in Microsoft Deployment Toolkit 2010

    The task sequence brings together the tools we need for the deployment end-to-end. I like to think of everything we’re using in terms of music: if the unattend files, the User State Migration Tool, Windows PE, applications, and drivers are the instruments, then the task sequence is the conductor and the sheet music. The end product is a symphony of automation that you have complete control over. Once everything is finished and ready for automation, you can pick how you want to deliver your builds. But we won’t cover that here; let’s save that discussion for the next blog on image and build delivery.

    Until then, I’ll leave you with a fully automated migration, including user data, from Windows XP to Windows 7 that I built myself (but did not narrate) using the free tools described above:

    Video: Windows XP to Windows 7 Migration

    Thanks for reading and happy deploying,

    Jeremy Chapman

    Windows Deployment

