good riddance. Might as well be using a rotary telephone.
Monitors are chicken feed compared to projectors. Like many other companies we have more than a dozen here, and I would describe them as exactly the opposite of cheap. If PCs aren't going to have any VGA-outs anymore, then each and every one of those several-thousand-dollar projectors is going to be turned into a very expensive paperweight.
Correct me if I'm wrong: DVI is an analog signal whilst HDMI uses a digital signal, and thus it provides audio. I cannot see a quality difference at all between VGA, DVI and HDMI. VGA is not dead to me but does seem outdated, not having HDCP support.
I may be wrong, but that's what I have read in the past.
DVI and HDMI are practically the same, just HDMI supports audio as well. I'm all for phasing out analogue and having digital. Even though DVI isn't mentioned, I don't think they're going to phase that out. Though if they do, then a simple HDMI-to-DVI converter is all that's needed. Not so simple doing VGA-to-HDMI as HDMI doesn't support analogue at all, unlike DVI.
Am I the only one that pictures 3 guys in a dark room, one wearing an Intel shirt, one a AMD shirt, and the third guy handing them a paper saying KILL VGA?
~Lordbob
DVI can provide digital and analog signals; that's why you can use a DVI-to-VGA adapter on some video cards. There is DVI-I (integrated, analog and digital), DVI-D (digital only), and DVI-A (analog only). You won't see many DVI-A, analog-only connectors. Up until recently most video cards had DVI-I outputs, and monitors have a DVI-D and maybe a VGA input.
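The adapter rules in the last couple of posts boil down to one question: do the two connectors share a signal type? Here's a quick sketch of that logic (the table and helper name are mine, just for illustration):

```python
# Signal types each connector family can carry (for illustration only).
SIGNALS = {
    "VGA":   {"analog"},
    "DVI-A": {"analog"},
    "DVI-D": {"digital"},
    "DVI-I": {"analog", "digital"},
    "HDMI":  {"digital"},
}

def passive_adapter_possible(src, dst):
    """A cheap passive adapter only rewires pins, so it works only when
    the source can emit a signal type the destination accepts; otherwise
    you need a powered active converter."""
    return bool(SIGNALS[src] & SIGNALS[dst])

print(passive_adapter_possible("DVI-I", "VGA"))   # True: DVI-I carries analog
print(passive_adapter_possible("HDMI", "DVI-D"))  # True: both digital
print(passive_adapter_possible("VGA", "HDMI"))    # False: needs active conversion
```

Which matches the posts above: DVI-to-VGA and HDMI-to-DVI adapters are simple passive plugs, while VGA-to-HDMI needs actual signal conversion.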
When I got my new Win 7 computer I was looking forward to using HDMI with my 32" Sony Bravia and so I bought a new high quality Belkin Pure AV Silver Series HDMI cable, but I was in for a big surprise!
When I connected my computer to the Bravia using the HDMI cable, the image was so severely overscanned that I could barely even see the window controls in the upper right-hand corner of the screen, and no matter how I adjusted my AMD 5570 graphics card to return to the 1366 x 768 screen size, the Bravia displayed hard-to-read text and distortion artifacts across the entire display. What I later discovered was that HDMI has been mandated to output only 720 or 1080 lines of resolution, and correcting the overscan resulted in distortions that appeared no matter what method I used to adjust the resolution to fit the 768-line screen resolution of the Bravia.
This distortion of the adjusted screen resolution manifests itself not only in difficult-to-read text, but also in a matrix of lines which look like cracks in the surface of every picture. I went back to using a DVI-to-VGA converter and now have a perfect 768 lines of resolution without the overscanning or the very distracting distortions created by using the mandated 720-line resolution of HDMI.
That said, if anyone has a solution for converting HDMI's mandatory 720-line output into 768 distortion-free lines that perfectly fit the screen using the Bravia's HDMI port, I'd be more than happy to hear how to do it, because that fine Belkin Pure AV HDMI cable is currently just lying around unused...
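For what it's worth, the artifacts Maxx describes are consistent with non-integer scaling: 720 source lines don't map evenly onto a 768-line native panel, so some rows have to be interpolated. A quick bit of arithmetic (panel figures from the post above, the rest is just illustration):

```python
# A 720-line HDMI signal shown on a 768-line native panel:
src_lines, panel_lines = 720, 768

scale = panel_lines / src_lines
print(scale)                     # ~1.0667 -- not an integer ratio

# 768/720 reduces to 16/15: every 15 source lines must be stretched
# across 16 panel lines, so roughly one panel row in sixteen is
# interpolated from its neighbours. That periodic interpolation is a
# plausible cause of the blurred text and grid-like "cracks" described.
print(panel_lines - src_lines)   # 48 extra lines to synthesize
```

A 1:1 pixel mapping (if the TV offers such a mode over HDMI) or a source mode that matches the panel exactly would avoid the interpolation entirely, which is effectively what the DVI-to-VGA route achieved.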
~Maxx~
Last edited by Maxxwire; 19 Dec 2010 at 19:59.