

Windows 7: Sapphire HD 4550 Doesn't allow 1680x1050


02 Mar 2010   #1

Windows 7 Home Premium (64 bit)
 
 
Sapphire HD 4550 Doesn't allow 1680x1050

I just got a new video card and installed it. Everything went well, except that the ideal resolution for my LCD monitor (a ViewSonic VX2035WM) doesn't show up as one of my options when I try to set the resolution. I downloaded the latest Catalyst drivers (10.2) rather than using the CD.

I was hoping for 1680x1050, the monitor's native resolution, but I don't see it as an option.

I do see 1600x1200, which displays, but the aspect ratio is off (a 4:3 mode on a 16:10 panel).
I am currently using 1440x900, which looks OK but doesn't use the panel to its full potential.

On a side note, I just downloaded the ViewSonic driver, and it claims not to see my monitor... and I had never even heard of a monitor driver before.

The only display option is a Generic Non-PnP Monitor, which I assume is the problem. I am not sure what this entry said when I was running Vista. How do I get my system to see it as an LCD monitor so I can get the proper resolutions? Or is there something else that will do the same thing?

It worked fine with the onboard Nvidia graphics, but I don't want to go back.
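
For anyone hitting the same wall: one quick way to see exactly which modes the video driver is exposing, before blaming the monitor entry, is to ask Windows for the driver's mode table. Below is a rough Python sketch (mine, not anything from this thread) that walks the standard Win32 EnumDisplaySettingsW API via ctypes. If 1680x1050 never shows up in the output, the problem is on the driver/EDID side rather than in the Screen Resolution dialog.

Code:
import ctypes

# Diagnostic sketch: list every display mode the driver reports for the
# primary monitor. Assumes Python 3 on Windows; the DEVMODE layout below
# mirrors the 220-byte DEVMODEW from wingdi.h.
class DEVMODE(ctypes.Structure):
    # Printer-only and ICM/panning fields are collapsed into opaque
    # padding, since this sketch never touches them.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_uint32),
        ("_position_union", ctypes.c_byte * 16),
        ("_printer_only", ctypes.c_short * 5),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_uint32),
        ("dmPelsWidth", ctypes.c_uint32),
        ("dmPelsHeight", ctypes.c_uint32),
        ("dmDisplayFlags", ctypes.c_uint32),
        ("dmDisplayFrequency", ctypes.c_uint32),
        ("_icm_and_panning", ctypes.c_uint32 * 8),
    ]

dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)
modes, i = set(), 0
# iModeNum counts up from 0; the call returns 0 once the table is exhausted.
while ctypes.windll.user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
    i += 1
for w, h, hz in sorted(modes):
    print("%dx%d @ %d Hz" % (w, h, hz))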


02 Mar 2010   #2

Windows 7 Ultimate 64-bit
 
 

Quote: Originally Posted by Little Darwin
...the ideal resolution for my LCD monitor (a ViewSonic VX2035WM) doesn't show up as one of my options when I try to set the resolution... The only display option is a Generic Non-PnP Monitor, which I assume is the problem.
Connect straight DVI, no adapters, and then try my tutorial here: Force DVI/HDMI resolutions and refresh rates
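
For the curious: once you know which mode you want, Windows also accepts a direct request through the ChangeDisplaySettingsW API. The Python sketch below is my illustration, not the method from the linked tutorial, and it can only select modes the driver already exposes; the CDS_TEST dry run makes it double as a check of whether the driver will accept 1680x1050 at all.

Code:
import ctypes

# Sketch: request a display mode directly. Assumes Python 3 on Windows;
# constants are the standard values from wingdi.h / winuser.h.
DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
ENUM_CURRENT_SETTINGS = -1
CDS_TEST = 0x00000002
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODE(ctypes.Structure):
    # Same 220-byte DEVMODEW layout as in the earlier sketch.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_uint32),
        ("_position_union", ctypes.c_byte * 16),
        ("_printer_only", ctypes.c_short * 5),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_uint32),
        ("dmPelsWidth", ctypes.c_uint32),
        ("dmPelsHeight", ctypes.c_uint32),
        ("dmDisplayFlags", ctypes.c_uint32),
        ("dmDisplayFrequency", ctypes.c_uint32),
        ("_icm_and_panning", ctypes.c_uint32 * 8),
    ]

def set_mode(width, height):
    user32 = ctypes.windll.user32
    dm = DEVMODE()
    dm.dmSize = ctypes.sizeof(DEVMODE)
    # Start from the current settings so everything else stays untouched.
    if not user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(dm)):
        raise OSError("could not read current display settings")
    dm.dmPelsWidth, dm.dmPelsHeight = width, height
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT
    # Dry run first: will the driver accept this mode at all?
    if user32.ChangeDisplaySettingsW(ctypes.byref(dm),
                                     CDS_TEST) != DISP_CHANGE_SUCCESSFUL:
        raise OSError("driver rejected %dx%d" % (width, height))
    user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0)  # apply for real

set_mode(1680, 1050)
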
02 Mar 2010   #3

Windows 7 Home Premium (64 bit)
 
 

Quote: Originally Posted by baarod
Connect straight DVI, no adapters, and then try my tutorial here: Force DVI/HDMI resolutions and refresh rates
Thanks,

I was wondering whether there would be any advantage to switching from my VGA cable to DVI.

Now to shop for a DVI cable...


02 Mar 2010   #4

Windows 7 Home Premium (64 bit)
 
 

I am now at the proper resolution...

A couple of updates.

I read on a different forum that someone had success using "Generic PnP Monitor" as the driver for this monitor. I changed it and rebooted, but was still unable to see the proper resolution.

I then looked in the Monitor Properties section of the Catalyst Control Center (CCC), where I was able to deselect the option to "Use Extended Display Identification Data (EDID) or driver defaults" and set Maximum Resolution to 1680x1050.

I rebooted and tried setting the resolution again from the desktop, and it still didn't show the right resolution choice.

I then went to the Desktop Properties section of CCC and was able to select 1680x1050. That did the trick!

I can now see 1680x1050 in the selection on the desktop as well.
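
Background on that EDID checkbox, for anyone curious: the monitor reports its capabilities in a small binary EDID block, and Windows caches a copy in the registry. The Python sketch below is mine (nothing to do with CCC); it reads the cached EDID and decodes the preferred timing descriptor, which for a VX2035WM should come out as 1680x1050. If no EDID is cached at all, that fits the Generic Non-PnP symptom: the VGA cable is likely not carrying the monitor's DDC data.

Code:
import winreg

# Sketch: decode each cached monitor EDID's preferred (native) mode.
# Windows stores the raw EDID under HKLM\SYSTEM\CurrentControlSet\Enum\
# DISPLAY\<model>\<instance>\Device Parameters.
def cached_edids():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as mkey:
                for j in range(winreg.QueryInfoKey(mkey)[0]):
                    inst = winreg.EnumKey(mkey, j)
                    try:
                        with winreg.OpenKey(mkey, inst + r"\Device Parameters") as p:
                            edid, _ = winreg.QueryValueEx(p, "EDID")
                            yield model + "\\" + inst, bytes(edid)
                    except OSError:
                        pass  # instance without a cached EDID

def native_mode(edid):
    # The first detailed timing descriptor (bytes 54-71) holds the panel's
    # preferred mode; active pixel counts are split across a low byte and
    # the high nibble of a shared byte.
    d = edid[54:72]
    width = d[2] | ((d[4] & 0xF0) << 4)
    height = d[5] | ((d[7] & 0xF0) << 4)
    return width, height

for path, edid in cached_edids():
    w, h = native_mode(edid)
    print("%s -> native %dx%d" % (path, w, h))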


Just for information, I am currently staying with my VGA cable.

In case someone searches for DVI and finds this thread: my card's port is DVI-I and my monitor's is DVI-D. I was able to confirm by searching that you can connect the two with a DVI-D cable. That is also a good option if you want to force a DVI-I port to use digital (DVI-I carries both digital and analog signals): a DVI-D cable forces digital by not passing the analog wires through.
02 Mar 2010   #5

Windows 7 Home Premium (64 bit)
 
 

Also, the card itself was a worthwhile upgrade. My Windows Experience Index graphics scores improved:

Before installing the card, the two graphics scores were 2.8 and 3.0.

After installation at 1440x900, they were 5.7 and 6.2.

After fixing the resolution and setting 1680x1050, they were 4.9 and 6.2 (the gaming graphics score stayed the same).