How can this be so hard? Why does Windows make it impossible to change monitors?
Here's a summary of the problem, with details below:
System: Windows 7 64-bit Home Premium SP1 with all important updates; Radeon HD 4830 video card; Gigabyte motherboard; Intel E8500 CPU; 8 GB memory.
My old monitor gave out a couple of days ago, so I went to the store yesterday and got a new one. The new monitor does not support the resolution my old one used, and I can't get Windows to stop outputting that resolution, so I keep getting "No Signal" whenever the video adapter is enabled. The monitor works fine in safe mode and with the video adapter disabled, but every time I try to reinstall the video adapter, it goes back to the old, incompatible resolution, and I get "No Signal."
Here are the details:
I hooked up my new monitor (to the HDMI output, same as the old one), turned on the PC, the monitor showed it going through the BIOS startup, and the Windows startup, but just before the login screen, it went black and said "No Signal."
OK, I know what the problem is. My old monitor had a native resolution of 1650 by something, and my new one is like 1920x1080, and doesn't have a 1650 mode. So I reboot and start Windows in safe mode. Now I can log in to Windows at 800x600. It has an ugly black background, but it works.
I go into Device Manager and disable the video adapter (my card is a Radeon 4830), and restart. Now Windows comes up in its normal pretty blue background. I go into Display, change the resolution to 1024x768, and that looks OK, but wastes a lot of space on the screen. I enable the adapter and restart.
I restart in safe mode, and this time instead of disabling the video adapter, I uninstall it. I restart, but as soon as I log in, Windows tells me it has automatically installed a Microsoft version of the Radeon driver, and I need to restart. When I do, I get "No Signal."
I restart in safe mode, disable the adapter again, restart, change the resolution to 1024x768, download the latest Radeon driver from AMD, install that, and restart. "No Signal."
I go through the safe mode-disable-restart cycle again, get on the net, and look for a solution. There seems to be a consensus that the only way to fix this is to uninstall the display driver, delete all the AMD folders, and then install a new driver. But Windows won't let you do that, because it automatically installs a Microsoft driver.
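Based on what I've read, I'm guessing the cleanup would look something like this from an elevated command prompt in safe mode (the oemNN.inf name below is just a placeholder; I'd have to find the actual published names from the listing first):

```
rem List every driver package staged in the driver store, and note the
rem "Published name" (oemNN.inf) of any ATI/AMD display entries
pnputil -e

rem Force-delete a staged package so Windows can't silently reinstall it
rem (oem12.inf is a placeholder for whatever number the listing shows)
pnputil -f -d oem12.inf
```

Is that roughly the right approach, or is there more to it than removing the staged packages?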
Half the answers say to turn off automatic updates, but many people say that won't help if the driver is already on the system, which in my case it is. The only way to prevent Windows from installing an existing driver is to type gpedit.msc in the Start menu search box, open the Group Policy Editor, and check a box that tells Windows not to install drivers automatically. But if you have the Home Premium version of Windows 7, there is no gpedit.msc; it's only on Professional and Ultimate, and there's apparently no workaround. I thought I found one when a few websites recommended a hack that was posted on this forum by davehc, but when I looked for it, I found that he had disavowed it because it didn't work reliably.
So how can this be? I've wasted a whole day trying to fix this. Why doesn't Microsoft let Home Premium users turn off automatic driver installs? Why do Microsoft and AMD both insist on installing new display drivers at the old resolution, instead of something safe like 800x600? Surely I'm not the only guy who got a new monitor that's incompatible with his old resolution, and surely it's easier to take ten seconds to change from a safe resolution to whatever you want the first time you boot with a new driver than to tear your hair out trying to escape an incompatible one.
There has to be a way to fix this without re-installing Windows.
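One thing I haven't tried yet, in case it changes anyone's suggestion: I've read that you can force Windows to boot with the basic VGA display driver from an elevated command prompt, which might at least let me log in at a low resolution without disabling the adapter. I'm not sure it applies to my situation, but supposedly it's:

```
rem Force the next boots to use the basic VGA display driver
bcdedit /set {current} vga on

rem Undo it later, once the resolution is sorted out
bcdedit /deletevalue {current} vga
```

Would that get me anywhere, or does it just do the same thing as disabling the adapter?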
I'm very comfortable with editing the registry, etc., so I'll be happy to do that if somebody tells me what entries to edit.
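For example, I found references to a DriverSearching key that supposedly controls whether Windows searches Windows Update for drivers. I can't vouch for it on Home Premium, and people say it won't help if the driver is already staged locally (which matches what I found above), but for what it's worth it looks like this:

```
Windows Registry Editor Version 5.00

; Supposedly stops Windows from searching Windows Update for drivers
; (SearchOrderConfig: 0 = do not search Windows Update)
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching]
"SearchOrderConfig"=dword:00000000
```

Is that the right key, or is there a different one that covers locally staged drivers?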
Or can I just delete the Microsoft version of the AMD display driver, so it can't automatically install it? If so, where is it, and what is it called?
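My guess from reading around is that the staged copies live under the driver store, and something like this (run as administrator) would show whether any ATI/AMD packages are in there. The ati* pattern is just my guess at how the folders are named:

```
rem Look for staged ATI/AMD display packages in the driver store
rem (the ati* pattern is a guess at the folder naming)
dir /b C:\Windows\System32\DriverStore\FileRepository\ati*
```

If those folders are what Windows reinstalls from, is it safe to delete them by hand, or does that corrupt something?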
If anyone can tell me what I need to do to fix this, I will be eternally grateful.