
View Full Version : Progressive versus Interlaced: 720p versus 1080i


Jason Dunn
05-14-2004, 09:00 AM
http://alvyray.com/DigitalTV/Naming_Proposal.htm

"This depicts how progressive (P) scanning works - see top row - versus how interlaced (I) scanning works - bottom row. People (such as Congressmen) read "our" 720P proposal and "their" 1080I proposal and assume that 1080I is superior because the number is larger. The diagram above shows that this is a fallacy. In the progressive system, 720 lines are presented to the human eye every 1/60th of a second while the so-called 1080I system presents only 540 lines. Therefore a more accurate name for the interlaced system is 540I. It also more accurately represents the fact that the interlaced system is lower in quality than our 720P system. Not only is 720P superior to 540I in quality (note the well-known artifact: interlace flickers), but it is cheaper! And it is naturally compatible with computers (which all use progressive scanning). Now you can understand the title of this page: It is 720P that is greater than 540I - formerly known in old-speak as 1080I."

A very informative site that is a must-read if you're confused about HDTV resolution and what progressive and interlaced mean.

Jon Childs
05-14-2004, 03:31 PM
On a real big screen TV (mine is 120" diagonal) the 1080i is much better for the vast majority of programs. Watching a football game is simply awesome at this resolution and size. I can hardly notice any artifacting at all. We recently started getting Red Sox baseball games in HDTV too. You can notice a little artifacting when watching a pitched ball, but it's not really distracting at all.

Jason Dunn
05-14-2004, 03:37 PM
On a real big screen TV (mine is 120" diagonal)

What?? They make TVs that big? 8O

dustindw
05-14-2004, 04:42 PM
This article seems to be a bit biased... renaming 1080i to 540i is ridiculous. The naming standard suggests total resolution, so it would defeat the purpose.

Anyone that has ever watched 1080i and 720p would know that there is really no visible difference between the two. You go into any electronic store (that is actually showing HD content) and look at a 1080i (typically CRT) set, and then go look at a 720p set (typically DLP or Plasma) and tell me you see a difference.

I have a 65" Mitsubishi CRT, and I have never once seen the "interlacing" effect when watching 1080i or 720p (upconverted to 1080i) signals.

This guy seems to be a bit off his rocker. Calling 1080i 540i would not do the 1080i format justice. I'm for both 720p and 1080i formats because I am a fan of DLP and Plasma technology... but my 1080i CRT still blows my mind when it comes to HiDef content. I'd like to know if anyone else can actually see interlacing on a 1080i set.

- Dustin
- www.superinsane.com

James Fee
05-14-2004, 05:29 PM
It's a weird article. :drinking:

A good 1080i or 720p feed will look great on any set that can support it. The biggest problem I feel with the feeds is multicasting. Our ABC multicasts, and Monday Night Football looks crappy compared to CBS on Sunday. I know 720p looks good because ESPN-HD feeds look great. A friend of mine who works for ESPN says that 720p handles motion much better than 1080i, and I'd tend to agree with him. PBS-HD looks just great when they show those landscapes.

BUT, this is all a moot point until the broadcasters start taking the quality of the pictures seriously.
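To put rough numbers on the multicasting complaint, here's a quick Python sketch. The 19.39 Mbit/s figure is the ATSC channel payload; the ~4 Mbit/s per SD subchannel is just an assumed value for illustration:

# Why multicasting hurts an HD feed: the ATSC transport payload is fixed,
# so every SD subchannel a station adds comes out of the bits left for HD.
ATSC_PAYLOAD_MBPS = 19.39   # total MPEG-2 transport payload per channel
SD_SUBCHANNEL_MBPS = 4.0    # assumed bitrate for one SD multicast feed

for subchannels in range(3):
    hd_bitrate = ATSC_PAYLOAD_MBPS - subchannels * SD_SUBCHANNEL_MBPS
    print(f"{subchannels} SD subchannel(s): ~{hd_bitrate:.1f} Mbit/s left for the HD feed")

Less bitrate means more compression artifacts on fast motion, which is exactly where football suffers.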

klinux
05-14-2004, 06:19 PM
1080p is the only way to go! :)

Felix Torres
05-14-2004, 07:11 PM
The fallacy of the article is that it assumes that the amount of data transmitted is equal to the amount of data displayed.
It isn't, of course.

At risk of being as simplistic as the author, I'll point out that with modern buffering and image processing you can extract more meaningful data for display out of the 1080 signal than you can out of the 720 signal, because you can interpolate in both the spatial and temporal domains. Sixty half-height fields at 1920x540 still carry more pixels per second than sixty full frames at 1280x720. It just needs some processing to coax it out of the signal.

As a result, what can be displayed from a 1080i signal approaches the amount of data that would be available in a 1080P signal and, by his own rules, that is *supposed* to be superior.
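A minimal sketch of the two interpolation modes involved, using NumPy; the function names and array shapes are just illustrative, not any particular chip's implementation:

import numpy as np

def weave(top_field, bottom_field):
    # Temporal interpolation: interleave two successive 540-line fields into
    # one 1080-line frame (full detail on static scenes, combing on motion).
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]), top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

def bob(field):
    # Spatial interpolation: rebuild the missing lines of a single field by
    # averaging the lines above and below (no combing, but half the detail).
    frame = np.repeat(field, 2, axis=0).astype(float)
    frame[1:-1:2] = (frame[0:-2:2] + frame[2::2]) / 2
    return frame

A motion-adaptive deinterlacer blends the two per pixel, weaving where the picture is still and bobbing where it moves, which is the processing that coaxes the extra detail out of the 1080i signal.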

Of course, this neglects to factor in the quality of the original signal and, more importantly, the native resolution of the display.

Even if 720 were clearly superior, trying to display such a signal on a display with a native 1080 resolution will result in artifacts that would not appear from a proper 1080 signal.

And with half the major networks lining up behind 720 and the other half behind 1080, it ultimately doesn't matter which is better; half the time you're going to be watching less-than-optimal but still superb imagery. This will continue to be an "issue" until we can all get super-advanced displays with native resolution of 3840 by 2160. (do the math, folks!) >;-)
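In case the math aside isn't obvious, a 3840x2160 panel scales both current HD rasters by whole-number factors, which is why it would sidestep the scaling problem entirely; a quick Python check:

panel_w, panel_h = 3840, 2160
for name, (w, h) in {"720p": (1280, 720), "1080i/1080p": (1920, 1080)}.items():
    print(f"{name}: x{panel_w // w} horizontally, x{panel_h // h} vertically")  # x3/x3 and x2/x2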

So it really doesn't matter which signal is better because you're not going to see the image displayed perfectly any time soon and any degradation in display will exceed the inherent difference between the signals.

Me, I'd be more interested in knowing how good the image processor circuitry of a given display is than its rated native resolution.
Because a bad image processor will wipe out any gains of going from ED to HD and then some.

Dr. Odd
05-17-2004, 05:43 PM
The author does seem uninformed and perhaps a little biased. I work in television, and adding fields to full-frame computer generated images is part of my job. I want to address a few points from the article:

1) "540I is incompatible with computers." Not true, of course, or we'd never have seen Toy Story broadcast on American television. Computer programs like After Effects can very easily add "3:2" fields to sequences of full frames. I will agree that it would be nice if we could avoid this step to save time, but there's certainly no compatibility issue.

2) "Current digital television (...sometimes referred to as "D1") has been called 480I up until now." Do not confuse professional "D1" resolution with consumer-level "DV" resolution: D1 is 720 x *486*, not *480*. DV is 720 x 480. A minor mistake, but one that makes me consider the author may be getting most of his information secondhand, and getting confused.

3) "The interlaced system is fundamentally flawed because of its flickering." In theory this is true: with interlacing, the alternate fields do flicker back and forth. But how many people notice this with their current TV sets? Not many, because of the way our eyes work: At 24 and 30 frames per second, the eye cannot distinguish between individual frames, seeing only fluid motion. Displaying 2 fields every 1/30th of a second is almost impossible for the naked eye to see.

4) "Our system delivers 720 lines at each instant; theirs 540." The author seems to be implying that interlacing removes half of the data from the broadcast image. This is not true: it simply takes twice as long to draw the original single frame image. But at 60 fields per second, the entire image is still drawn within the 30 frames/second needed, well within the 24-30 fps required to fool the eye into seeing fluid motion. The net result is that the human eye will see more resolution at 1080i than 720p. Felix Torres was absolutely correct about this in his post above mine.

5) "Side-by-side with 720P and 540I (well, 1080I, if you insist) systems...480P looks better!" At this point the author is no longer contrasting the two high-def formats with each other, but is instead comparing *both* against current TV resolution (albeit progressive), and stating they fall short. It is ridiculous to claim that 480-progressive looks better than 720-progressive. The author says that it is because all high-def televisions suffer from the "inability to deliver high brightness". This is a consumer-end hardware issue, and it has nothing to do with distinguishing between high-def formats.

So why bring this last argument up? The author goes on to state that "480P is affordable; 540I is not - by either broadcasters or consumers." Is this the real issue behind this article? That it will cost broadcasters money to upgrade their equipment for high-def, regardless of whether it's 720p or 1080i? Does he have a financial stake in broadcast television?

If this is the whole reason that the author wrote this article, then I would suggest that he find the brightness control on his television and turn it up. Spreading misinformation in an attempt to prevent the move to high-def broadcasts is doing a service to no one but his own self interests.
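As promised above, here's a rough Python sketch of the 2:3 (commonly called "3:2") pulldown cadence that lays 24 fps progressive frames off to 60 interlaced fields; this is just an illustration of the cadence, not how any particular program implements it:

def pulldown_2_3(film_frames):
    # Each group of four 24 fps film frames (A B C D) becomes ten video
    # fields, i.e. five interlaced frames, following a 2-3-2-3 cadence.
    cadence = [2, 3, 2, 3]
    fields = []
    for i, frame in enumerate(film_frames):
        for _ in range(cadence[i % 4]):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))  # (source frame, field parity)
    return fields

print(len(pulldown_2_3(["A", "B", "C", "D"])))  # 10 fields from 4 frames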

Jon Childs
05-19-2004, 03:47 PM
On a real big screen TV (mine is 120" diagonal)

What?? They make TVs that big? 8O

I guess TV is not quite correct. It is really a front projector with a pulldown screen. It really has been a great addition to my basement. With a 2 year old and a 4 month old I don't get out too often.

Suhit Gupta
05-19-2004, 06:59 PM
On a real big screen TV (mine is 120" diagonal)
What?? They make TVs that big? 8O
I guess TV is not quite correct. It is really a front projector with a pulldown screen. It really has been a great addition to my basement. With a 2 year old and a 4 month old I don't get out too often.
What's the quality like? Do you experience problems if the room/ambient light is too bright?

Suhit

Jon Childs
05-20-2004, 12:03 AM
On a real big screen TV (mine is 120" diagonal)
What?? They make TVs that big? 8O
I guess TV is not quite correct. It is really a front projector with a pulldown screen. It really has been a great addition to my basement. With a 2 year old and a 4 month old I don't get out too often.
What's the quality like? Do you experience problems if the room/ambient light is too bright?

Suhit

The quality is awesome. HDTV content looks better than the movie theater, especially compared to a print that has been run a bunch of times. Given the proper circumstances, no rear-projection TV can hold a candle to it.

It is in my basement so room/ambient light isn't a big issue. I have found it can stand up to a little ambient light, but I wouldn't recommend it for a first floor family room with lots of windows.

Wilbert
05-20-2004, 04:20 AM
And with half the major networks lining up behind 720 and the other half behind 1080, it ultimately doesn't matter which is better; half the time you're going to be watching less-than-optimal but still superb imagery. This will continue to be an "issue" until we can all get super-advanced displays with native resolution of 3840 by 2160. (do the math, folks!) >;-)

So it really doesn't matter which signal is better because you're not going to see the image displayed perfectly any time soon and any degradation in display will exceed the inherent difference between the signals.

This is 100% TRUE. Most people walk into a Best Buy and think they are getting an "HD" set when in fact they are just getting a TV that can receive the 1080i or 720p signal and then reduce the image to its own resolution. 1080i or 720p is not the same as 1024x768. The whole numbering system for TVs is out of whack. The fact is there is NOT a single true HDTV on the market unless you want to spend $20,000. The only display that comes close to showing TRUE HD is Apple's HD Cinema Display. It has the resolution to do it.

Plus, the other side of this is that most TV stations "upconvert" their image anyway. I do not think we will ever see true HD as we once thought we would. Yes, we have better picture quality, but TRUE HD looks like a picture you hold in your hand or hang on the wall. I have seen it and it makes your jaw drop. But I do not think we will ever see it because consumers will settle for price over quality.
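To make the point about native resolution concrete, here's a small Python sketch; the 1024x768 panel is the example figure from above, and real sets of course scale in their own ways:

def displayed_pixels(signal, panel):
    # A set whose panel is smaller than the incoming raster has to scale the
    # picture down to fit (preserving aspect ratio), discarding detail.
    sw, sh = signal
    pw, ph = panel
    scale = min(pw / sw, ph / sh, 1.0)  # never upscale for this comparison
    return int(sw * scale), int(sh * scale)

print(displayed_pixels((1920, 1080), (1024, 768)))  # -> (1024, 576) on a 1024x768 panel

So a "1080i-capable" set with a lower native resolution never actually puts 1080 lines in front of your eyes.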