I'm not sure of the exact spec of the card, as I'm not at the PC right now, but it's the one that came in an Acer Media Centre PC (Aspire iDea 510, I think). Anyway, someone made a stupid choice when designing this piece of kit. It's meant to be used with TVs, both HD and SD, and as far as I know most (if not all) "HD Ready" TVs, which can display 720p pictures, actually have 1366x768 panels; mine (a Toshiba of some sort) included. So putting in a graphics card that doesn't support 1366x768 was the start of the problem.
The TV has never managed to fill the screen with a 1280x720 picture, whether over HDMI or VGA (well, DVI out of the PC into VGA on the TV): it always either cuts bits off or leaves a big black border (don't ask me why; I've tried every combination I can think of). So we've just lived with running it at 1280x768 and ignored the small black strips down either side.
On Sunday I formatted the thing and installed Windows 7. Windows Update went and fetched Intel's drivers for the card, we're faced with exactly the same problem, and I was wondering if there's any way to solve it.
Weirdly, at 1280x768, even though the TV shouldn't need to do any pixel stretching (768 lines matches the panel height, and the black strips suggest the 1280 columns are shown 1:1), the picture still has the artefacts you get when an LCD is fed a non-native resolution: slightly blurry, fuzziness when the mouse moves, and so on. Yet at, say, 1024x768, with big black bars at the sides, it looks nice and crisp.
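For anyone puzzling over the numbers, here's a quick back-of-envelope sketch (my own arithmetic, nothing from Intel's drivers or the TV's manual) of what a 1366x768 panel has to do with each mode: either centre it 1:1 and show black bars, or stretch it to fill with a non-integer scale factor. My guess is the blur at 1280x768 means the TV is quietly doing the latter horizontally.

```python
# Rough arithmetic for non-native modes on a 1366x768 panel.
# All figures are my own back-of-envelope numbers, not from any driver.

PANEL_W, PANEL_H = 1366, 768

def fit(mode_w, mode_h):
    """For a given source mode, return the black-bar sizes if the panel
    maps it 1:1 (centred), and the scale factors if it stretches to fill."""
    bar_x = (PANEL_W - mode_w) / 2   # black strip on each side, in pixels
    bar_y = (PANEL_H - mode_h) / 2   # black strip top and bottom
    scale_x = PANEL_W / mode_w       # horizontal stretch needed to fill
    scale_y = PANEL_H / mode_h       # vertical stretch needed to fill
    return bar_x, bar_y, scale_x, scale_y

for w, h in [(1280, 768), (1280, 720), (1024, 768)]:
    bx, by, sx, sy = fit(w, h)
    print(f"{w}x{h}: bars {bx:g}px / {by:g}px, "
          f"or scale {sx:.3f} x {sy:.3f} to fill")
```

So 1280x768 needs either 43px bars (crisp) or a 1.067x horizontal stretch (blurry); if the TV insists on a slight stretch anyway, that would explain the fuzziness, while 1024x768 looking crisp suggests it does map that one 1:1.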
If anyone's got any useful information or ideas, please let me know.