DVI issue with Nvidia 6600GT


  1. Posts : 5
    Windows 7 RC1
       #1

    DVI issue with Nvidia 6600GT


    Hey all,

    I'm having a bit of an issue. I have a Xerox LCD that accepts both VGA and DVI input, a 6600GT with the latest Windows Update drivers loaded, and Windows 7 RC1. When I use a DVI cable only, the computer boots up perfectly and the screen shows the Windows loading symbol, but as soon as it fades and the desktop should appear, I get a "no signal" message and the monitor shuts off. After much searching I had the idea to plug a DVI-to-VGA converter into my card, which makes everything work fine. The card has two DVI ports and I've tried both with the same results (no signal over DVI, but everything works when the DVI-to-VGA adapter is in line).

    I would still like to actually use DVI for what it is, so does anyone have any ideas on how I can remedy this situation?

    Thanks for the help


  2. Posts : 5,747
    7600.20510 x86
       #2

    Download and install the latest beta driver which works great here on my 6200A-LE.

    64 bit = GeForce ForceWare 186.08 Vista and 7 64-bit download from Guru3D.com

    32 bit = GeForce ForceWare 186.08 Vista and 7 32-bit download from Guru3D.com

    Other than that, check your refresh rate to make sure it is compatible with the monitor.


  3. Posts : 5
    Windows 7 RC1
    Thread Starter
       #3

    I tried the driver you suggested and the problem still persists (I plugged in the VGA adapter to type this). On top of that, for whatever reason that driver does not support my monitor's native 1440x900 resolution, so I had to switch back to the Windows Update one, which did. I checked the refresh rate as well and it's fine at 60 Hz. Thanks though.


  4. Posts : 9,582
    Windows 8.1 Pro RTM x64
       #4

    Hi silver2wind and welcome to Se7en Forums

    It could be a configuration setting on the monitor itself. What is its exact model number?


  5. Posts : 5
    Windows 7 RC1
    Thread Starter
       #5

    The monitor is a Xerox XR6-19DW. I also tried downloading and installing the monitor's official driver, but that didn't seem to change a thing.

    Here's my thinking so far; let me know if I'm wrong. This isn't a monitor issue, because at boot, and right up until the Windows desktop is about to appear, the monitor shows a picture in DVI mode; only then does it blank out and say "no input". But if, right then and there, I plug in the VGA adapter, it works perfectly. For the same reason, it isn't a cable issue either.

    I tried some odd shutdown routines, such as unplugging the VGA at a specific moment, or unplugging the monitor for 10 seconds with only DVI connected and then restarting everything, but that didn't work either.


  6. Posts : 5,747
    7600.20510 x86
       #6

    silver2wind said:
    at boot and up until windows desktop is about to show the monitor shows in dvi mode and only then blanks out and says "no input"

    Yeah, that's when the video driver is loaded. If you boot into Safe Mode, where no video driver is applied (besides the standard VGA driver), I'm guessing it would work fine.

    Not sure how to help further, but if it were me, I'd clean the system as thoroughly as possible of all NVIDIA stuff: remove drivers and apps, clean the registry, find leftover files by hand and delete them, etc.

    Then reinstall either the previously mentioned beta or WHQL driver from NVIDIA's site.

    Before you do that though, do you have another DVI monitor to try? Perhaps from another system?

    silver2wind said:
    but if right then and there I plug in the vga adapter it'll work perfectly. For this same reason it also is not a cable issue.
    That's sound thinking for the most part, but DVI may use a wire in the cable that isn't necessary for VGA to work when you have the converter in line, and that wire could be broken, or its connection at either end could be. Have you checked whether any of the pins are bent?
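    For what it's worth, the wire being described is most likely the DDC (Display Data Channel) pair, which DVI uses to read the monitor's EDID block; if it's damaged, the driver can't negotiate a digital mode even though the analog pins still work through the converter. A minimal Python sketch of how a driver decodes the preferred mode from an EDID detailed timing descriptor (the sample bytes are hand-built for illustration, not read from this monitor):

    ```python
    def parse_dtd(d: bytes):
        """Decode an 18-byte EDID Detailed Timing Descriptor."""
        pixel_clock_hz = int.from_bytes(d[0:2], "little") * 10_000  # stored in 10 kHz units
        hactive = d[2] | ((d[4] & 0xF0) << 4)   # horizontal active pixels
        hblank  = d[3] | ((d[4] & 0x0F) << 8)   # horizontal blanking pixels
        vactive = d[5] | ((d[7] & 0xF0) << 4)   # vertical active lines
        vblank  = d[6] | ((d[7] & 0x0F) << 8)   # vertical blanking lines
        refresh_hz = pixel_clock_hz / ((hactive + hblank) * (vactive + vblank))
        return hactive, vactive, round(refresh_hz, 1)

    # Hand-built descriptor for 1440x900 at roughly 60 Hz (106.5 MHz pixel clock).
    sample = bytes([0x9A, 0x29, 0xA0, 0xD0, 0x51, 0x84, 0x22, 0x30] + [0x00] * 10)

    print(parse_dtd(sample))  # (1440, 900, 59.9)
    ```

    If the DDC wire is open, none of this data reaches the card, which is consistent with the driver falling back to "no signal" the moment it takes over from the BIOS.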


  7. Posts : 587
    Windows 7 x64
       #7

    silver2wind said:
    Hey all,

    I'm having a bit of an issue. I have a xerox lcd which allows both vga and dvi inputs, a 6600GT with latest windows update drivers loaded and windows 7 RC1. Now, when I use a DVI cable only, the computer will boot up perfectly and the screen will show the windows loading symbol, but as soon as it fades and the desktop should pop up I get a "no signal" message and the monitor shuts off. After much searching I had the idea to plug in a dvi to vga converter to my card which makes everything work fine. The card has 2 DVI ports and I've tried both with the same results(No signal output through DVI but works when I plug in the DVI to VGA adapter in-line).

    I would still like to actually use my DVI for what it is so does anyone have any ideas as to how I can remedy this situation?

    Thanks for the help
    Have a look at the monitor settings. Bring up the monitor's menu and look for an option to set the input to Digital. My LCD has an Analogue vs. Digital option.
    Last edited by Victek; 09 Jun 2009 at 17:39.


  8. Posts : 5
    Windows 7 RC1
    Thread Starter
       #8

    Thanks for all the suggestions everyone.

    I uninstalled all of the NVIDIA drivers and the monitor drivers and started from scratch again, but ended up with the same result. It seems I HAVE to use the Windows Update driver to even get the 1440x900 resolution my screen requires; the newer NVIDIA driver just doesn't have the option.

    I wish I had another monitor to test, but unfortunately I don't.

    The monitor menu makes no mention of selecting the input. There is a separate "auto" button on the screen, which I use to switch between inputs when both my laptop and desktop are plugged in (that doesn't apply to this problem, as only the desktop is connected), and pressing the auto button with only one input connected has no effect.


  9. Posts : 587
    Windows 7 x64
       #9

    silver2wind said:
    Thanks for all the suggestions everyone.

    I uninstalled all of the nvidia drivers and the monitor drivers and started from scratch again but ended up with the same result. It seems I HAVE to use the windows update driver in order to even get the resolution of 1440x900 that my screen requires, the newer Nvidia driver just doesn't have the option.

    I wish I had another monitor to test, but unfortunately I don't.

    The monitor menu makes no mention of selecting the input. There is a separate "auto" button on the screen which I use to switch between inputs when I have my laptop and desktop plugged in (That doesn't apply to this problem as only the desktop is plugged in) and pressing the auto button with only one input in the monitor has no effect.
    FWIW, I think the problem is with the screen, not the card/PC. Can you borrow another LCD with a DVI input to test?


  10. Posts : 5
    Windows 7 RC1
    Thread Starter
       #10

    But if it were the monitor, why does it work with a DVI signal all the way up until the Windows loading symbol appears?

    Either way, I've been wanting to get a DVI-to-HDMI converter to try hooking the computer up to the TV; I guess now's as good a time as any.


 

Windows 7 Forums is an independent web site and has not been authorized, sponsored, or otherwise approved by Microsoft Corporation. "Windows 7" and related materials are trademarks of Microsoft Corp.
