Quote: Originally Posted by Kari
Here's something not worth its own thread, but I want to say it anyway. If I've misunderstood the point of this thread, I sincerely ask the OP to tell me and I will delete this post.
The household CEO and I like to watch the American TV series "Numb3rs". Just yesterday, while watching an episode, I was thinking about the difference between the European PAL and American NTSC systems. (NTSC = Never The Same Color)
I mean, I paid a lot for a completely digital, high-end home entertainment system with Full HD 1080 capability, and yet most of the shows I'd like to watch come from the U.S.A., so what I see is usually a pretty lousy picture, most often in 4:3 format. The quality is often quite poor because of the NTSC-to-PAL conversion.
Over there behind the big water, is this digital broadcasting you've just started going to change anything about my dilemma? Can I expect better picture quality from American shows in the future?
Kari, the problem is not in the NTSC standard. The problem is in the crappy implementation providers use. The bottom line rules, and the bottom line says "compress the bejeezus out of it as long as it sort of resembles the original, so we can send two or three additional signals in the same bandwidth".
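To see why stuffing extra signals into one channel hurts, here's a rough back-of-the-envelope sketch. The 19.39 Mbps figure is the real total payload of an ATSC broadcast channel; the per-subchannel audio/overhead reservation is my own assumption, purely for illustration.

```python
# Why multicasting hurts picture quality: splitting one fixed-rate
# multiplex among several subchannels leaves less bitrate for each.
ATSC_PAYLOAD_MBPS = 19.39  # total ATSC (A/53) transport stream payload

def per_channel_video_mbps(subchannels, audio_overhead_mbps=0.5):
    """Approximate video bitrate left per subchannel, after reserving
    an assumed amount per subchannel for audio and tables."""
    usable = ATSC_PAYLOAD_MBPS - subchannels * audio_overhead_mbps
    return usable / subchannels

for n in (1, 2, 3, 4):
    print(f"{n} subchannel(s): ~{per_channel_video_mbps(n):.1f} Mbps video each")
```

With three or four subchannels, each stream is squeezed well below what a well-mastered DVD gets, which is exactly where the blocky artifacts come from.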
Now with HD, you get compression artifacts that are smaller. Hell, an upscaled, well-mastered DVD sometimes looks better than the HD program transmitted by the station / cable / satellite.
For true HD, look to Blu-ray, but that only brings me to my pet peeve rant, which I may start a thread about ... and no, it's not about the format war; that's over and done and gone, and the image quality is great. No, it's about how a bunch of multi-billion-dollar technology corporations managed to come up with such a moronic set of UI standards, if "standard" is indeed a concept that can be applied to it.