11 Jan 2009
Primer on Device Support and Testing for Windows 7
As most folks (finally) get the beta and start to set aside some time to install and try out Windows 7, we thought it would be a good idea to start talking about how we support devices through testing and work across the PC ecosystem. This is a big undertaking and one that we take very seriously. As we talked about at the PDC, this is also an area where we learned some things that we want to apply to Engineering Windows 7. While this is a massive effort across the entire Windows organization, Grant George, the VP of Test for the Windows Experience, is taking the lead in authoring this post. We think this is a deep topic and I know folks want to know more, so consider this a kick-off for more to come down the road. –Steven
Devices and Drivers in Windows
One of the most important responsibilities in a release of Windows is our support of, and compatibility with, all of the devices and their associated drivers that our users have. The abstraction layer in Windows that connects software and hardware is a crucial part of the operating system. That layer is surfaced through our driver model, which provides the interface for all of our partners in the multi-faceted hardware ecosystem. Windows supports a vast range of devices today: audio devices (speakers, headsets…), display devices (monitors…), print, fax, and scan devices, digital cameras, portable media devices of all shapes, sizes, and functions, and more. Windows is an open platform for companies across the globe who develop and deliver these devices to the marketplace and our users. Our job is to understand that ecosystem and those choices, and to verify that those devices and drivers work well for our customers, which includes partnering with device providers throughout the engineering of Windows 7.
Drivers provide the interface between a device and the Windows operating system, and are citizens of the WDM (Windows Driver Model). WDM was initially created as an intermediary layer of kernel-mode drivers to ease the authoring of drivers for Windows. There are different types of drivers; the two most common are class drivers (hardware device drivers that support an array of devices of a similar hardware class, where hardware manufacturers make their products compatible with standard protocols for interaction with the operating system) and device-specific drivers (provided by the device manufacturer for a specific device, and sometimes for a specific version of that device).
Support for our hardware partners comes in the form of the Windows Driver Kit (WDK) and, for certification, the Windows Logo Kit (WLK). The WDK enables the development of device drivers and, as of Windows Vista, replaced the previous Windows Driver Development Kit (DDK). The WDK contains all of the DDK components plus the Windows Driver Foundation (WDF) and the Installable File System kit (IFS). The Driver Test Manager (DTM) is another component here, but is separate from the WDK. The Windows Logo Kit (WLK) aids in certifying devices for Windows; it contains automated tests as well as a run-time framework for those tests. These tests are run and passed by our hardware vendor partners in order to use the Microsoft “Designed for Windows™” logo on devices. This certification process helps us and our hardware partners ensure a specific level of quality and compatibility for devices interacting with the Windows operating system. Hardware devices and drivers that pass the logo kit's tests qualify for the Windows logo and driver distribution on Windows Update, and can be referenced in the online Windows Marketplace.
Validation and Testing
With Windows 7 we have modified driver model validation, new and legacy device testing, and driver testing. Compared to Vista, we now place much more emphasis on validating the driver platform and verifying legacy devices and their associated drivers throughout our product engineering cycle. Installed-base data for each device is an integral part of testing, and we gather this data from a variety of sources, including voluntary, opt-in, anonymous telemetry as well as sales data and IHV roadmaps. We have centralized and standardized the testing mechanics of the lab approach to this area of the product in a way that yields much earlier issue/bug discovery than in past releases. We have also ramped up our efforts to communicate platform or interface changes earlier to our external hardware partners, to help them ensure their test cycles align with our schedule. In addition, we draw a more robust correlation between real-world usage data, including recent trends, and the prominence of each device and the prioritization it is given in our test labs. This is especially important for new and emerging devices that will come to market right before and just after we release Windows 7 to our customers.
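To make the prioritization idea concrete, here is a minimal sketch of how installed-base share and a recent-usage trend might be blended into a test-lab priority score. The weights, device names, and numbers are all illustrative assumptions for this post, not Microsoft's actual model.

```python
# Hypothetical sketch: rank devices for test-lab priority by combining
# installed-base share with a recent-usage trend. All data is made up.

def priority_score(installed_share, trend, weight_share=0.7, weight_trend=0.3):
    """Blend overall installed-base share with the recent growth trend."""
    return weight_share * installed_share + weight_trend * trend

devices = [
    # (device name, installed-base share, recent growth trend) -- illustrative
    ("LegacyPrinter", 0.40, -0.05),
    ("NewWebcam",     0.02,  0.30),
    ("MainstreamGPU", 0.35,  0.10),
]

# A growing mainstream device can outrank a larger but declining legacy one.
ranked = sorted(devices, key=lambda d: priority_score(d[1], d[2]), reverse=True)
for name, share, trend in ranked:
    print(f"{name}: score={priority_score(share, trend):.3f}")
```

The point of the trend term is the one the post makes: emerging devices with small installed bases still need lab attention before their market share catches up.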
Another important element in bringing a high-quality device and driver experience to our Windows 7 users is the staging of our overall engineering process. For this release all of our engineering teams have followed a well-structured and staged development process. The development/coding of new features and capabilities in Windows 7 was broken out into three distinct phases (milestones), with dedicated integration and stabilization time at the end of each coding phase. This included ensuring our code base remained highly stable throughout the development of Windows 7 and that our device and driver test validation was a constant part of those milestones. Larry discussed this in his post, as some might recall. Program Managers, Developers, and Testers all worked in very close partnership throughout the coding phases. Our work with external partners, especially our device manufacturer partners, was also enhanced through early forums we provided for them to learn about the changes in Windows 7 and to work closely with us on validation. Much more focus has been put on planning and then executing: planning the work and then working the plan. Our belief is that this yields much more predictability in developing and delivering our new features in Windows 7, both in feature content and in overall schedule. We recognize that this raised the bar on how our external partners see us execute and deliver on that plan when we say we will, but we also hope it increases their confidence in how they engage with us in validating the device experience during our development and delivery of Windows 7.
Determining Which Devices to Test
Our program management team helps us drive device market share analysis. Most of their data comes from our Customer Experience Improvement Program, which gives us data on the actual hardware in use across our customer base. For example, there are over 16,000 unique 4-part hardware IDs for display devices alone. We understand that it only takes a single device not functioning well to degrade an overall Windows experience or upgrade, and we definitely want to reinforce this shared understanding.
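For readers unfamiliar with the "4-part hardware ID" mentioned above: for a PCI device, a Plug and Play hardware ID encodes the vendor, device, subsystem, and revision. The parser below is an illustrative sketch (not a Windows API), and the example ID is made up but follows the real format.

```python
# Sketch: split a 4-part PCI hardware ID into its components.
# Real IDs look like PCI\VEN_xxxx&DEV_xxxx&SUBSYS_xxxxxxxx&REV_xx;
# the specific ID below is invented for illustration.
import re

HWID_PATTERN = re.compile(
    r"PCI\\VEN_(?P<vendor>[0-9A-F]{4})"      # PCI vendor ID
    r"&DEV_(?P<device>[0-9A-F]{4})"          # device ID
    r"&SUBSYS_(?P<subsys>[0-9A-F]{8})"       # subsystem vendor+device
    r"&REV_(?P<rev>[0-9A-F]{2})"             # revision
)

def parse_hwid(hwid: str) -> dict:
    m = HWID_PATTERN.fullmatch(hwid.upper())
    if not m:
        raise ValueError(f"not a 4-part PCI hardware ID: {hwid}")
    return m.groupdict()

parts = parse_hwid(r"PCI\VEN_10DE&DEV_0A65&SUBSYS_83891043&REV_A2")
print(parts)
```

Counting unique 4-part IDs (rather than just vendor/device pairs) is what produces figures like the 16,000 display-device IDs above, since each subsystem and revision combination is distinct.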
New devices typically have a small initial user base, but the driver will often be mostly new code (or the first time a code base has seen a new device). As the device enters the mainstream, market share grows and most manufacturers continue to develop and improve their drivers. This is why, for our customers and for our own testing, it's important to always have the latest drivers for a given device.
Over a device's lifetime, we work closely with our external device partners and represent in our test labs, as faithfully as possible, a prioritized mix of devices to ensure old and new hardware continues to work well with Windows. By paying very close attention to trends in the marketplace across our device classes, we can make guided decisions about what to test and at what priority.
We use the notion of equivalence classes to help us define and prioritize our hardware (device) test matrix. Creating equivalence classes involves grouping things into sets based on equivalent properties across related devices. For example, imagine if we worked for a chemical company and it was our job to test a car polish additive on actual automobiles. Given a fixed test budget, we would want to maximize the number of makes and models we test our product on. We begin by analyzing the current market space so we can make the best choices for our test matrix.
Let's say the first test car we analyze is a blue 2003 Ford Mustang. We also know that the same blue paint is used on all of Ford's 2003 and 2004 models and on all of Mazda's 2005 models. This means our first automobile represents several entries in our test matrix based on equivalence: one test of the polish on that paint covers the 2003 and 2004 Ford models and the 2005 Mazda models.
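The car-polish example above can be sketched in a few lines: group the test matrix by the property that actually matters (the paint formula), then test one representative per class. The makes, years, and paint names below are illustrative.

```python
# Sketch of equivalence-class partitioning from the car-polish example:
# cars sharing a paint formula belong to one class, so one polish test
# per class covers every member. Data is illustrative.
from collections import defaultdict

# (make, model year, paint formula) -- the paint formula is the equivalence key
fleet = [
    ("Ford",  2003, "blue-A"),
    ("Ford",  2004, "blue-A"),
    ("Mazda", 2005, "blue-A"),
    ("Ford",  2003, "red-B"),
    ("Honda", 2006, "blue-C"),
]

classes = defaultdict(list)
for make, year, paint in fleet:
    classes[paint].append((make, year))

# One test run per paint formula covers every member of its class:
# 5 fleet entries collapse to 3 tests.
for paint, members in classes.items():
    representative = members[0]
    print(f"{paint}: test {representative}, covers {len(members)} entries")
```

The same partitioning idea applies to devices: hardware that shares a chipset or a class driver can often be covered by testing one representative, which is how a fixed lab budget stretches across thousands of hardware IDs.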