I think the issue may come from the graphics card itself. It works on XP but not with Vista or Seven... (sorry for my English, I speak French :))
For me it's working on Vista and XP but not in Win 7.
Have you tried PowerStrip? EnTech Taiwan | Utilities | PowerStrip
I was able to force my 4-year-old laptop to output 1920x1080 even though its max resolution was well below that. :) Might want to give it a try.
What's the native resolution of your monitor?
Is it 1440x900 or 1680x1050?
Because you're mentioning two different resolutions here.
My monitor's native resolution is 1680x1050, and I don't see 1440x900 listed at all.
My native res is 1680x1050, and I can't use that res!
Looks like an EDID problem to me...
Follow #2 and #3 from here: Force DVI/HDMI resolutions and refresh rates
Upload the resulting file somewhere so I can take a look.
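If it helps, here's a rough Python sketch of what those steps boil down to: Windows caches each attached monitor's EDID in the registry, so you can also pull the raw bytes out yourself. The registry path is the standard one Windows uses; the output filenames are just my choice.

```python
import winreg

# Windows caches each attached monitor's EDID under this key.
BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield the names of all subkeys of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:
            return

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for monitor in subkeys(display):
        with winreg.OpenKey(display, monitor) as mon:
            for instance in subkeys(mon):
                try:
                    with winreg.OpenKey(mon, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # no cached EDID for this instance
                # Dump the raw block so it can be inspected or uploaded.
                with open(f"{monitor}_{instance}.bin", "wb") as f:
                    f.write(edid)
                print(monitor, instance, len(edid), "bytes")
```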
What you may need: Male DVI-I to Female VGA adaptor
What you may have to do:
Create an override EDID using #4-... from the above linked post (if you hand-edit the EDID bytes, see the checksum sketch after this list).
Break the highlighted pin off the DVI-VGA adaptor, as in this image.
Connect your monitor with your VGA cable plugged into the DVI-VGA adaptor, which in turn plugs into the DVI port of your graphics card. Then update the Windows driver with your custom driver.
I can help you with the software part; the hardware part (getting the adaptor, breaking the pin) is up to you.
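One thing about the override EDID that trips people up: every 128-byte EDID block ends in a checksum byte, so any hand edit has to recompute it. A small sketch, assuming an EDID 1.3 base block; the function names and "monitor.bin" filename are mine, not from the linked post:

```python
def fix_checksum(block: bytearray) -> bytearray:
    """Each 128-byte EDID block must sum to 0 mod 256; byte 127 makes it so."""
    assert len(block) == 128
    block[127] = (-sum(block[:127])) % 256
    return block

def native_resolution(edid: bytes) -> tuple:
    """Read the preferred mode from the first detailed timing descriptor
    (bytes 54-71 of an EDID 1.3 base block)."""
    d = edid[54:72]
    h = d[2] | ((d[4] & 0xF0) << 4)  # horizontal active pixels
    v = d[5] | ((d[7] & 0xF0) << 4)  # vertical active lines
    return h, v

edid = bytearray(open("monitor.bin", "rb").read())
print(native_resolution(edid))         # expect (1680, 1050) here
edid[:128] = fix_checksum(edid[:128])  # re-checksum the base block after edits
open("monitor_fixed.bin", "wb").write(edid)
```

If the printed resolution isn't your panel's native mode, that confirms the EDID the card is seeing is wrong and the override is worth trying.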
Problem Solved!
Microsoft pushed drivers for my monitor to Windows Update and I downloaded them; now it works! I have 1680x1050 now, on Windows 7 x64 (build 7264).
Thanks, Microsoft!
I have a weird problem. My FX5200 card is capable of dual display, but I can only get one monitor to its proper resolution of 1680x1050. The other will not give me that option; instead I have to use an odd setting like 1600x1200.
Anyone have any idea why it's doing this?
Also, I installed the latest driver, and the Nvidia Control Panel is not present on my system for some reason.