XFX Radeon 5770 - Blurry


  1. Posts : 3
    Windows 7, 64-bit
       #1

    XFX Radeon 5770 - Blurry


    I recently built a new computer and am having an incredibly difficult time getting the graphics card (XFX Radeon 5770) to function properly with Windows 7, 64-bit (via HDMI).

    Everything looks a bit blurry, especially the text. HDMI, DVI, and VGA connections all look bad. VGA looks a little better, but not as clear as HDMI with my Windows XP machine, and I also want to output sound.

    A few notes:
    * Running the most recent ATI drivers (10.6)
    * Pixel ratio set to 1:1 on the TV
    * Using native resolution (1080p/60hz, also tried 59hz without improvement; see the sketch after this list for double-checking the mode Windows reports)
    * Overscan set to 0%
    * ClearType text has been adjusted
    * Tried older drivers: 10.5, 10.4, 9.8, and 9.1 without success
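
    To double-check what Windows itself thinks it is outputting (as opposed to what the TV's info screen says), a small script like the rough sketch below can query the current mode of the primary display through the plain Win32 API. Nothing in it comes from this thread; it is a generic, untested-as-posted example that assumes Python is installed on the Windows box.

    Code:
    # Rough sketch (Windows-only): ask Windows what mode it believes the
    # primary display is running, via the Win32 EnumDisplaySettingsW call.
    import ctypes
    from ctypes import wintypes

    ENUM_CURRENT_SETTINGS = -1  # "give me the mode currently in use"

    class DEVMODEW(ctypes.Structure):
        # Field-for-field copy of the Win32 DEVMODEW layout (display variant,
        # with the dmPosition POINTL flattened into two longs).
        _fields_ = [
            ("dmDeviceName",         wintypes.WCHAR * 32),
            ("dmSpecVersion",        wintypes.WORD),
            ("dmDriverVersion",      wintypes.WORD),
            ("dmSize",               wintypes.WORD),
            ("dmDriverExtra",        wintypes.WORD),
            ("dmFields",             wintypes.DWORD),
            ("dmPositionX",          ctypes.c_long),
            ("dmPositionY",          ctypes.c_long),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor",              ctypes.c_short),
            ("dmDuplex",             ctypes.c_short),
            ("dmYResolution",        ctypes.c_short),
            ("dmTTOption",           ctypes.c_short),
            ("dmCollate",            ctypes.c_short),
            ("dmFormName",           wintypes.WCHAR * 32),
            ("dmLogPixels",          wintypes.WORD),
            ("dmBitsPerPel",         wintypes.DWORD),
            ("dmPelsWidth",          wintypes.DWORD),
            ("dmPelsHeight",         wintypes.DWORD),
            ("dmDisplayFlags",       wintypes.DWORD),
            ("dmDisplayFrequency",   wintypes.DWORD),
            ("dmICMMethod",          wintypes.DWORD),
            ("dmICMIntent",          wintypes.DWORD),
            ("dmMediaType",          wintypes.DWORD),
            ("dmDitherType",         wintypes.DWORD),
            ("dmReserved1",          wintypes.DWORD),
            ("dmReserved2",          wintypes.DWORD),
            ("dmPanningWidth",       wintypes.DWORD),
            ("dmPanningHeight",      wintypes.DWORD),
        ]

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)

    # A null device name means "the primary display".
    if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                                 ctypes.byref(dm)):
        print("Current mode: %dx%d, %d-bit color, %d Hz"
              % (dm.dmPelsWidth, dm.dmPelsHeight,
                 dm.dmBitsPerPel, dm.dmDisplayFrequency))
    else:
        print("EnumDisplaySettingsW failed")

    On both machines this should report 1920x1080 at 32-bit/60 Hz if the desktop settings really are identical; it says nothing about how the card packages that into the HDMI signal, which seems to be where the difference lies.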

    My 5 year old Windows XP Dell, with a $40 Radeon 4350 card, has no problem producing crisp text via HDMI on this same HDTV. I tried the same 4350 card in my new Windows 7 computer and experienced the same problem as with the 5770.

    The only progress that I've been able to make in narrowing down the problem is that although the output settings are set to be identical in both my XP and W7 computers (1080p/60hz, 32-bit color), my HDTV (Sony Bravia KDL-46w5100) is receiving different signals. I know the signals are different because the TV is treating them differently, interpreting the XP signal as a PC signal and the W7 signal as a video signal. Further evidence that the signals are being output differently: XP display info, W7 display info.

    According to Sony's tech support agent, the TV is actually intended to treat 1080p/60hz computer signals via HDMI as "Video Signals," so technically the TV is treating the XP signal incorrectly, even though the picture quality is outstanding in XP and poor in W7.

    Does anyone have any idea why a 1080p/60hz signal would be output differently in XP vs. Windows 7? Or more generally how to fix my problem?


  2. Posts : 2,528
    Windows 7 x64 Ultimate
       #2

    It seems like it would be entirely under the control of either the ATI driver or the Sony firmware, since it's a direct hardware-to-hardware connection.

    There must be some subtle difference between the signal outputs that is tripping the TV into TV mode now :/

    Is there maybe a menu setting on the TV for the HDMI input to select between "TV" and "Computer" modes? Also, "for fun," have you tried a 720p mode to see if that makes things look any different (other than just "larger," of course)? :)

    I ended up using the RGB outputs on my slightly older ATI card and slightly older HDTV and, strangely, got by far the best signal.


  3. Posts : 3
    Windows 7, 64-bit
    Thread Starter
       #3

    fseal said:
    It seems like it would be entirely under the control of either the ATI driver or the Sony firmware, since it's a direct hardware-to-hardware connection.

    There must be some subtle difference between the signal outputs that is tripping the TV into TV mode now :/

    Is there maybe a menu setting on the TV for the HDMI input to select between "TV" and "Computer" modes? Also, "for fun," have you tried a 720p mode to see if that makes things look any different (other than just "larger," of course)? :)

    I ended up using the RGB outputs on my slightly older ATI card and slightly older HDTV and, strangely, got by far the best signal.
    I agree; there must be a subtle difference between the signal outputs of my XP and W7 machines, despite identical output configurations.

    Unfortunately, there is no way to manually select TV vs. computer mode for the HDMI input on the TV.

    I did try adjusting the screen resolution on both the XP and W7 machines, and regardless of the resolution, the XP box stayed crisp (the TV continued to treat it as a PC signal) and the W7 box stayed blurry (the TV continued to treat it as a video signal).

    It's so frustrating/infuriating to spend a bunch of money building a new high-end computer and have it not be able to do something as basic as displaying clear text, which my old computer had no problem accomplishing.


  4. Posts : 6,879
    Win 7 Ultimate x64
       #4

    Instead of setting it to 1080/60p, have you tried setting it to just plain old 1920x1080 (Desktop properties section of the CCC)?

    Start here,

    XFX Radeon 5770 - Blurry-settings1.png

    and set it to 1920x1080 here,

    XFX Radeon 5770 - Blurry-settings2.png


  5. Posts : 3
    Windows 7, 64-bit
    Thread Starter
       #5

    stormy13 said:
    Instead of setting it to 1080/60p, have you tried setting it to just plain old 1920x1080 (Desktop properties section of the CCC)?
    No such setting exists. If you look at your screenshot, the highest resolution is 1680x1050; the same is true for me under the "Basic" resolutions node in "Desktop Properties." After 1680x1050, there is an "HDTV" node with 1000i, 1000p, 1080i, 1080p, 648p, and 720p.

    One new curious thing... I just adjusted my resolution to 1360x768, restarted my computer, and turned my screen off and on; while the resolution on my computer definitely changed, the TV's input info still displays a 1080p signal. I'm not sure what, if anything, this means...


  6. Posts : 2,528
    Windows 7 x64 Ultimate
       #6

    I guess that makes sense. The resolution of your desktop and the HDMI signal don't need to be the same. I don't know if HDMI has required default signalling or not, but it sounds like the card is outputting standard TV resolutions. Though maybe that is the actual problem: W7 aggressively adheres to the resolutions that the monitor reports as acceptable. So with the HDMI cable in, either the TV is telling W7 what resolution to use, or the card is built to use "TV" resolutions for output on HDMI. Whereas in XP you could set any output on any interface, even if it would blow up your monitor :)
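
    If you want to see exactly what the TV is reporting back, Windows keeps a copy of the EDID block every monitor hands it in the registry, under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. The rough sketch below (untested as posted, assumes Python on the box, and may need an admin prompt if access is denied) dumps those blocks and decodes the preferred/native timing from the first detailed timing descriptor. Note that it lists every monitor the machine has ever seen, not just the one plugged in right now.

    Code:
    # Rough sketch: dump the EDID block(s) monitors have reported to Windows,
    # straight from the registry, and decode the preferred (native) resolution
    # from the first detailed timing descriptor.
    import winreg  # on Python 2 the module is _winreg

    BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

    def edid_blocks():
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
            for i in range(winreg.QueryInfoKey(display)[0]):       # monitor models
                model = winreg.EnumKey(display, i)
                with winreg.OpenKey(display, model) as model_key:
                    for j in range(winreg.QueryInfoKey(model_key)[0]):  # instances
                        inst = winreg.EnumKey(model_key, j)
                        try:
                            sub = inst + r"\Device Parameters"
                            with winreg.OpenKey(model_key, sub) as params:
                                edid, _ = winreg.QueryValueEx(params, "EDID")
                                yield model, inst, bytes(edid)
                        except OSError:
                            pass  # no EDID stored for this instance

    for model, inst, edid in edid_blocks():
        if len(edid) < 72:
            continue  # truncated/corrupt block, skip it
        # The first detailed timing descriptor starts at byte 54 of the base block.
        dtd = edid[54:72]
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        print("%s (%s): preferred timing %dx%d, %d bytes of EDID"
              % (model, inst, h_active, v_active, len(edid)))

    If the preferred timing stored for the Sony comes back as something other than 1920x1080, that would go a long way toward explaining why W7 never offers a plain 1920x1080 "Basic" mode over HDMI.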

    I'm using the RGB outputs, which means there is no feedback from the TV, so I'm allowed to select any output resolution I want. The only problem is that if the TV is off when I reboot, Windows DOES detect that and actually removes the monitor from my setup, and I have to re-add it every time.

    I sure as hell wish there was a "lock" function for video setup.

    I have read that there is a way of forcing resolutions, maybe in the tutorials section, but it may not apply to the HDMI output, since it seems like you can already select the resolution but the card is converting it to a standard TV resolution.

    If you have the option of connecting the RGB output to your TV, you may have better luck.


  7. Posts : 7,683
    Windows 10 Pro
       #7

    b00134n said:
    stormy13 said:
    Instead of setting it to 1080/60p, have you tried setting it to just plain old 1920x1080 (Desktop properties section of the CCC)?
    No such setting exists. If you look at your screenshot, the highest resolution is 1680x1050; the same is true for me under the "Basic" resolutions node in "Desktop Properties." After 1680x1050, there is an "HDTV" node with 1000i, 1000p, 1080i, 1080p, 648p, and 720p.
    Hmmm... reading your statement, and out of curiosity, I checked my panel and found this...

    XFX Radeon 5770 - Blurry-capture.jpg

    Under HDTV there is a 1920x1080 setting. Make sure you scroll all the way down.


 
