720p vs 1080i
My HDTV supports both 720p and 1080i.
For the best quality image what format should I opt for on my HDTV?
Is this LCD or Plasma? From my experience, I would have to recommend 720p over 1080i. I'm not sure if you've ever read up on progressive vs. interlaced, so I'm not going to go into all that in this reply. 720p handles fast motion better than 1080i, with less "blurring" in fast action scenes.
Another question is what content you'll be viewing. Is it standard definition DVDs or Blu-rays? A standard DVD I would only upscale to 720p at most.
Maybe just try both settings and see what works for you.
Last edited by electrotune1200; 18 Jan 2010 at 20:22. Reason: misspell
Hi there
if your broadcaster has it, go for 1080i.
Unfortunately a lot of HD these days is Horribly Disappointing as it is often "Upscaled" from normal TV standards.
I don't think there are many HD native transmissions yet in true 1080p, which is what you SHOULD use if the TV and broadcast system are available. That's roughly 2.25X the pixels of 720p and much clearer than 1080i on a decent TV set.
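The "X better" figure falls straight out of the pixel counts. A quick back-of-envelope check (the resolutions are the standard ones; the comparison is just frame pixel counts, nothing more):

```python
# Pixel counts per full frame for the common HD formats.
# Note 1080i has the same frame size as 1080p, but sends it
# as two half-frames (fields) of 540 lines each.
formats = {
    "720p":  (1280, 720),
    "1080i": (1920, 1080),
    "1080p": (1920, 1080),
}

pixels_720p = 1280 * 720       # 921,600 pixels per frame
pixels_1080p = 1920 * 1080     # 2,073,600 pixels per frame

ratio = pixels_1080p / pixels_720p
print(f"1080p carries {ratio:.2f}x the pixels of 720p")  # -> 2.25x
```

So 1080p/1080i frames hold about two and a quarter times the detail of a 720p frame, which is where claims like "2-3X better" come from.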
The HD service from the UK SKY TV system I believe isn't too bad, but it's still only 1080i.
However technology seems to be leapfrogging the available hardware -- the UK's SKY service will be offering 3-D transmissions this year -- with the appropriate TV the quality of this is around 6X better than 1080p - so women if you are on TV -- powder your noses etc etc -- any small blemish can be seen.
The cost of these sets will be a bit beyond me for a while -- ah well there's always "The 6 Numbers..".
If you are in the market for a new TV I'd hold off a bit for the moment -- this should be an interesting year for "the old Goggle Box".
Cheers
jimbo
I have a 56 inch Sony plasma that I will be hooking up to a WD TV...
My TV Supports 1080i and 720p...
So I guess I'll be sticking with 720p.
Thanks guys...
I believe WD TV does 1080p output, so I'd go for 1080i conversion, rather than the full downscale to 720p.
Yes, but 720p displays all 720 lines in every frame.
1080i only displays 540 lines per field -- first the odd lines, then the even ones.
Hi there
this is one of those arguments where "a little learning is a dangerous thing".
The i in 1080i means Interlaced -- the signal carries only half the picture lines in each pass (first the odd lines, then the even ones), so on a progressive display the missing lines have to be reconstructed -- hence the "Deinterlacing".
The hardware buffer can hold several lines of data before outputting to the screen, so the process is nearly instantaneous.
The hardware does this very fast using an algorithm which takes data from the preceding line and the subsequent line and constructs the missing data -- usually reasonably accurately.
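The preceding-line/subsequent-line trick is just interpolation. Here's a minimal sketch of that idea in Python -- a hypothetical grayscale "field" holding only the transmitted lines, with each missing line rebuilt by averaging its neighbours (real TV deinterlacers are far more sophisticated, using motion detection across fields):

```python
def deinterlace(field):
    """Double the line count of one field by linear interpolation.

    `field` is a list of scan lines (each a list of pixel values).
    Missing lines are constructed from the preceding and subsequent
    real lines; the bottom edge just repeats the last real line.
    """
    frame = []
    for i, line in enumerate(field):
        frame.append(line)  # keep the real (transmitted) line
        if i + 1 < len(field):
            nxt = field[i + 1]
            # construct the missing line from the lines above and below
            frame.append([(a + b) // 2 for a, b in zip(line, nxt)])
        else:
            frame.append(line[:])  # no line below: repeat the edge
    return frame

field = [[0, 0, 0], [100, 100, 100]]  # two real scan lines
print(deinterlace(field))
# -> [[0, 0, 0], [50, 50, 50], [100, 100, 100], [100, 100, 100]]
```

Where the picture changes smoothly the guessed line is close to the truth, which is why static scenes deinterlace well and fast motion is where 1080i shows its seams.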
Large prints from digital cameras often use a similar "Up rezzing" technique to create "missing pixels" when you have, say, an 8MP digital camera and you need a 30 X 40 inch print for example.
Quality is usually pretty good - remember that most TV programs aren't still shot in true HD quality anyway.
The best way to test is to get a colleague to switch the SAME picture from 720p to 1080i and see if you can tell which is which -- you must be out of the room when he sets the rate. The picture must be the same, otherwise the quality of the source transmission might make a difference.
Don't use an upscaling DVD player as these will set the upscale dependent on the TV signal it detects (HDMI can sense the resolution of the target TV).
The best signals to use for testing these days are Sports - these are usually transmitted in proper HD although in W.Europe true 1080p transmissions are still rare.
Cheers
jimbo
LCDs can't interlace though, they can only show progressive... so any 1080i signal will still be deinterlaced and scaled to 720p on his set (if that's his max resolution). The quality would depend on how good the processor is on his TV. Your mileage may vary, depending on your set, but honestly, on my TV (42" LG), I can't tell the difference between 720p and 1080i.