Could 1080P be the “final” home video format?

On our last show (Episode 043), we discussed watching The Dark Knight, especially the parts that were filmed with IMAX cameras. Bryan noted that while watching on his friend’s 1080P television, there was a noticeable difference in the picture quality when the film transitioned from the 2.40:1 widescreen to the IMAX portions. That really surprised me, as you could tell on the podcast (or you will be able to tell when you listen – it looks like this article will be posted before the show, which is good, because I am a bit embarrassed to be writing this and am glad it will get pushed down the page). The reason I sounded so surprised is that I had assumed there would be no difference in the actual picture quality between the IMAX and the standard film portions of The Dark Knight.

What follows is an explanation of why I assumed that the images would be of equal quality, and some theorizing about the implications of my assumption being incorrect. This is very, very fascinating to me, but I promise not to have my feelings hurt if you don’t click the link to read more.

When assessing the maximum visual fidelity available, one has to consider all of the components that go into displaying the image. These components form a chain, shown here:

Original Film : Media : Player : Cable : Monitor

The visual fidelity one can expect can never exceed the capacity provided by the “weakest” link in the chain. For example:

35 mm (standard movie film) : Blu-Ray : PS3 : Component Video Cables : Standard Definition Television

In this example, the SD television is the “weakest” link in the chain. The visual fidelity will be limited by the television’s ability to render the image. This is why I would never advise someone to buy a Blu-Ray player if they weren’t planning on buying an HDTV.
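The rule can be sketched in a few lines of code. The capacity numbers below are rough stand-ins of my own (approximate lines of vertical resolution each link can pass), not measured specs:

```python
# Sketch of the "weakest link" idea: the fidelity of the whole chain is
# capped by its least-capable component. Capacity values are my own rough
# assumptions (approx. lines of vertical resolution), not measured specs.
chain = {
    "35 mm film": 3000,        # assumed; published estimates vary widely
    "Blu-Ray": 1080,
    "PS3": 1080,
    "component cables": 1080,
    "SD television": 480,
}

def weakest_link(chain):
    """Return the link that caps the whole chain, and its capacity."""
    name = min(chain, key=chain.get)
    return name, chain[name]

print(weakest_link(chain))  # the SD television caps everything at 480 lines
```

Swap in different links and capacities to model any of the chains below; whichever entry has the smallest number is what you are actually watching.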

35 mm (film) : Blu-Ray Disc : Blu-Ray Player : Composite Video Cable : 1080P TV

Here, the composite video cable is the weakest link, limiting you to an interlaced SD image. This is why God gave the world geeks, so we can assist with the proper connection of devices.

70 mm (IMAX) : DVD : PS3 : HDMI : 1080P TV

Here, the DVD is the weakest link in the chain. Although a setup like this would allow one to enjoy a DVD to its maximum potential, it will still produce a standard definition image. That is why God made Blu-Ray players!

35 mm (film) : Blu-Ray Disc : PS3 : HDMI : 720P Television

70 mm (IMAX) : Blu-Ray Disc : PS3 : HDMI : 720P Television

These chains show my viewing experience watching The Dark Knight. Since the 720P TV is the weakest link in the chain, the picture I see would be limited to what that TV can render. Since everything else in the chain exceeds my TV’s capacity, the total visual fidelity should be equal, whether viewing the portions filmed with IMAX or with standard film stock. Bryan’s assumption that a difference would not be visible on a 720P TV is absolutely correct.

35 mm (film) : Blu-Ray Disc : PS3 : HDMI : 1080P Television

70 mm (IMAX) : Blu-Ray Disc : PS3 : HDMI : 1080P Television

These chains show Bryan’s viewing experience watching The Dark Knight. If there really is a difference in image quality, then it indicates that the 35 mm film is the weakest link in the chain, which means that the capacity of a Blu-Ray disc to present picture information exceeds that of 35 mm film.

The idea that a Blu-Ray disc could store an image that eclipses 35 mm film contradicts everything that I had heard or read about the format. My understanding was that the capacity of 35 mm film to store picture information far exceeds a 1080P digital signal. If a Blu-Ray disc were really capable of exceeding the image presented in all (non-IMAX) movie theatres, one would expect to have heard claims like “better than movie theater quality.” Estimates I have seen indicate that it would take between 3 and 12 million pixels to capture all of the picture data provided on a 35 mm frame. This means that a 1080P image (about 2.1 million pixels) only captures somewhere between 17% and 69% of the picture that standard film stock provides.
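For the record, the arithmetic behind that range looks like this (the 3 and 12 million figures are the estimates mentioned above):

```python
# How much of 35 mm film's picture data does a 1080P frame capture?
pixels_1080p = 1920 * 1080                   # 2,073,600 pixels per frame
film_low, film_high = 3_000_000, 12_000_000  # estimated pixel equivalents of 35 mm

low_pct = 100 * pixels_1080p / film_high     # against the generous estimate
high_pct = 100 * pixels_1080p / film_low     # against the conservative estimate
print(f"1080P captures roughly {low_pct:.0f}% to {high_pct:.0f}% of 35 mm film")
```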

At this point, you can see why I was surprised that there could be a difference. Based on what I knew, seeing an actual difference should not be possible. If 35 mm film truly is better than a 1080P signal, then having 70 mm film to start the chain wouldn’t make a difference.

But, I do trust Bryan (except when it comes to objectively assessing games with “Halo” in the title), so I did some more research, watched the section of the movie that he was talking about and did some more theorizing.


In the category of “the difference is only in his head”:

In the section that Bryan discussed, Jim Gordon’s face will clearly show more detail in IMAX, because it is simply larger. On my screen, his head was about 15” tall during the IMAX section, and in the standard film section that immediately followed, it was about 12”. This may not seem like much, but on a 1080P television that would represent somewhere around ten thousand pixels (possibly more, but I am already too nerdy – I am not about to do the math on that).

The IMAX portion completely fills the screen, while the standard portion fills significantly less. That means that there is more picture data being shown, and that is going to look better.

Other differences in lighting and shot composition could result in better-looking images in the IMAX portions.
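The head-size point can actually be put to numbers. Here is a quick back-of-envelope sketch; the 46-inch screen size and the head-width ratio are my own assumptions for illustration, not anything from the show:

```python
import math

def head_area_px(head_height_in, diag_in=46.0, width_ratio=0.75):
    """Approximate on-screen pixel area of a head of a given height.

    diag_in (screen diagonal) and width_ratio (head width as a fraction
    of its height) are assumed values for illustration.
    """
    screen_height_in = diag_in * 9 / math.hypot(16, 9)  # 16:9 panel geometry
    ppi = 1080 / screen_height_in                       # vertical pixels per inch
    height_px = head_height_in * ppi
    width_px = height_px * width_ratio                  # assumed head proportions
    return height_px * width_px

extra = head_area_px(15) - head_area_px(12)  # IMAX head vs. standard head
print(f"Roughly {extra:,.0f} extra pixels devoted to the face")
```

Under these assumptions the difference runs well past ten thousand pixels, which only strengthens the “it just looks better because it’s bigger” explanation.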


In the category of “there really is a difference … therefore 1080P may be the gold standard for many, many years”:

Even though it may take 3 to 12 million pixels to completely capture all of the picture data in a 35 mm frame, at that extreme level some of that data is going to be blurry, grainy, or otherwise not conducive to presenting a pretty picture. The 70 mm IMAX film (several times the image area) could certainly capture additional details that do manage to make the transition to the 1080P image.

In low light situations (like the scene mentioned in our podcast), film becomes grainier and captures less total picture information. This would exacerbate the issue described above. In this case, it is possible that the 1080P image could eclipse the picture captured on standard film.

Since the IMAX film has several times the surface area, it may be easier to capture clean images for the digital transfer.


So, if you have made it this far, let me know. I am very interested in the experiences of other people watching IMAX images on a top-of-the-line video setup. I am leaning toward the “there really is a difference” side of the issue…

If 1080P really can keep up with (and exceed) 35 mm film, that is great news (for our pocketbooks!). It means that Blu-Ray will be the physical media standard for many years to come (35 mm film has been the standard in the movie business for over 50 years, and no one complains that movies in the theatre need better picture quality…). It would also mean that we are really close to having digital distribution of the best quality images we can perceive. I have read about the next level of HD, which requires multiple terabytes of data per movie, but if we are already exceeding film, we are set to go.

This could mean that 1080P images will do for movies and television what MP3s did for music: provide a digital version of something that gives end users as much quality as they have the capacity to perceive and enjoy. After all, MP3s contain only a fraction of the sound information stored on an SACD, but no one cares. In fact, I’ll bet that most folks have never heard of the SACD format…
