Originally Posted by plaplante
An article by Terry Paullin that I found on the net somewhere that may or may not explain things!
98 percent of you don’t get it!
Calm down. I am NOT casting aspersions on your intellectual prowess. I am referring, instead, to that magic and ever more elusive Holy Grail, “1080p.”
The digital transition, finally complete as you read this, has caused many folks to re-evaluate their A/V entertainment needs and eventually take the plunge into high-definition television.
While many 720p (720 progressively scanned horizontal lines of resolution) sets were sold, especially in the early days, more recently the alphanumeric “720p” has been the kiss of death for retailers. If it ain’t 1080p, it ain’t S…well, Sellable. This notion is unfortunate, really, because in many applications the difference in resolution is of little or no consequence.
Consider the case of a 40-inch panel viewed at 12 feet—few could tell the difference. There are so many other determinants that make a real difference in picture quality (contrast ratio and colorimetry, for openers), that for many consumers, taking advantage of a 720 “blow-out” sale should be a no-brainer. But for those who must pursue the BIG number, there are some things you should know if you want to be numbered in that other 2 percent.
Let’s start with the obvious. Some folks think that if it says “1080p” on the back of the Blu-ray Disc™ jacket, that is what they will get, no matter what. Well, maybe not. If they bought a 720p display, or if they don’t know WHAT they bought, just that it said (legitimately) High Definition on the front panel, they will, of course, never see 1080p on that panel—there is that matter of throwing away the extra 1,152,000 pixels before the set can render an image. (I said obvious!) “But I bought a display (LCD, DLP, Plasma, LCoS—doesn’t matter) with a native matrix of 1080 by 1920,” you say. “Surely, I’m good to go.” Uhh…probably not, I counter.
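That “extra 1,152,000 pixels” figure is just the difference between the two panel resolutions, which a quick back-of-the-envelope check confirms:

```python
# Pixel counts behind the "1,152,000 extra pixels" figure:
# a 1080p panel versus a 720p panel.
full_hd = 1920 * 1080   # native 1080p matrix: 2,073,600 pixels
hd_720 = 1280 * 720     # native 720p matrix: 921,600 pixels

print(full_hd - hd_720)  # 1152000 -- the pixels discarded when 1080p is shown on a 720p set
```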
Let’s start with the myriad of potential problems with settings.
First, almost all Blu-ray Disc players have output resolution settings. If you want 1080p, you may have to set it there yourself, as opposed to accepting some other factory-default resolution. Some displays won’t accept 1080p directly, so you may have to set the device that feeds them to 1080i output, letting the display do the de-interlacing (you can never actually watch “i” anything on a non-CRT display—it can only display a progressively scanned signal).
If the BD player is set correctly, the signal’s next stop may be an A/V receiver. Many new receivers have output resolution settings as well, so they may need attention. Ditto if the signal makes an additional stop at an outboard video processor, not uncommon in a modern installation (especially if I did it).
You are clean so far, right? Now come several circumstances that can knock you out of contention for true 1080 without your even realizing it.
First is the display “mode.” If you select anything other than a native pixel mapping (aka “dot-for-dot,” “pixel-for-pixel,” or, on an LG display, “Just Scan”), you will not see the 1,080 horizontal lines that just came down the pipe. That means if you use “Zoom,” “Letterbox,” “Intelligent Zoom,” or even one click of “Overscan,” often deployed to rid the screen of those pesky little white retrace lines at the top on certain broadcast channels, you’re not getting full 1080. In the case of a front projector, one click of horizontal or vertical “keystone” correction will cause a similar error, robbing you of true 1080. While the visual result may not be blatantly obvious on “normal” content, a simple test pattern will confirm the toxic effect any of the above will have on our precious 1080p.
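To see why even one click of overscan defeats dot-for-dot mapping, consider the geometry. Here is a minimal sketch, assuming a hypothetical 1 percent stretch per click: once 1,080 source lines are scaled by 1.01x, almost every source line lands on a fractional panel row and has to be interpolated (blended) with its neighbor instead of mapping cleanly onto one row of pixels.

```python
from fractions import Fraction

# Assumed: one click of overscan = a 1% stretch (scale factor 101/100).
# Count how many of the 1080 source lines still land exactly on a panel row.
scale = Fraction(101, 100)
exact = sum(1 for line in range(1080) if (line * scale).denominator == 1)
print(exact)  # 11 -- only lines at multiples of 100 survive; the other 1069 get blended
```

Exact fractions are used here so the integer test isn’t muddied by floating-point rounding; the point stands for any scale factor that isn’t exactly 1.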
Calibrators and competent custom installers will be familiar with a test pattern called “Multi-burst.” It shows alternating black and white vertical stripes. First many, then fewer and fewer, until we try to discern one pixel on, one off. Think of it as a frequency-response test for video imagery. If you were to view this pattern in a “dot-for-dot” mode, you would likely see clean, pixel-wide stripes in the high-frequency portion of the pattern. Now introduce one click (often 1 percent) of overscan and watch the pattern get destroyed. The same will happen to an even greater degree with any attempt at scaling (Zoom, Letterbox, etc.). Once you see the severe signal degradation any of those adjustments cause, you will know that you have lost the clean 1080p that you so zealously sought.
There are many similar “Gotchas” in the Blu-ray™-influenced, video-processed, HD world we currently live in. While this stuff still isn’t rocket science, a little professional help in the form of test equipment, test patterns, and, more important, experience in their use, is more welcome than ever before to ensure you are “getting it,” in the broadest sense. I walk into situations every day where signal integrity has been compromised by an inadvertent wrong setting, an overlooked default, or even an explored but misinterpreted menu item. The permutations and possibilities are so numerous that I have no trouble with the alleged 98 percent number. Thinking back, it may be even more of a stretch to imagine that 2 percent got everything right!
I encourage all to take the little extra time (and expense, if necessary) to wring the most out of your investment. Like the constant chant of this magazine, ensuring the “Best that it can be” will pay daily, significant rewards. Once you’ve made the investment, the difference between a “looks okay to me” picture and the RIGHT picture is simply a matter of understanding and paying attention to the adjustments.