|Datacolor SpyderTV Display Calibration Kit|
|Written by Mike Levy|
|Monday, 01 January 2007|
Home theater enthusiasts have long known about the need for video calibration. For years, the Imaging Science Foundation has trained professionals to return your set to the standards that filmmakers and broadcast companies expect. Yet the average consumer might ask why there is a need for video calibration at all. Why don’t video companies with vast resources set the latest and greatest HDTVs correctly at the factory? While most manufacturers have improved their picture quality since the ISF and Joe Kane made us all aware of the necessity of calibrating our monitors, most HDTVs still need calibration out of the box. They are usually set up to deliver the most “impressive” image, not the most accurate one. Imagine that you are trying to sell HDTVs under the giant sodium lights at Costco, not in the light-controlled rooms of a high-end custom integrator – you might bump the brightness a little to move some boxes, even if you knew the set might last three years longer and look better at home with a different setup. SpyderTV is the first solution I know of that attempts to eliminate the calibrator by having the computer do the job instead. The question is, “How do the results compare to what a trained calibrator would do in the field?”
The Datacolor SpyderTV, which retails for a manageable $699, is a computerized method for calibrating your monitor, TV or projection system. When you open the box, you’ll find three DVDs, a manual and a sensor that connects to your computer via a single USB cable. One disc (Spyder 2 Pro) is for calibrating the monitor on your computer, and the other two discs (SpyderTV Pro and test patterns) are for calibrating any TV or projection system in your home. It’s very simple and in many ways a very refreshing idea that is long overdue.
A Brief History On Video Calibration
Back in 1987, when a 25-inch TV was considered large, I discovered Joe Kane in a small room at the Chicago CES displaying his new laserdisc, entitled A Video Standard. Kane explained how it would improve and standardize the image on any monitor by allowing it to be calibrated to the industry reference standard. The televisions available at that time were sold with their settings far off the standard, with each manufacturer, either intentionally or accidentally, presenting a color palette unique to each set.
Intentionally? Why would a manufacturer intentionally set the color incorrectly? Well, now hear my treatise on Lowest Common Denominator marketing. As Kane explained, Sony had done a test where they had average consumers choose the image they preferred from several identical sets that had their color calibrated differently. Sony had varied the color temperature.
What Sony realized through their tests was that they could not make the image blue enough for the average consumer. Sony wisely changed their calibration settings and their sales soared. When the industry found out what Sony had done, other manufacturers followed suit and that was the end of the 6,500K standard. The only problem was that, while the average consumer loved it, movie enthusiasts wanted the colors that directors intended on their films.
What Kane showed me on that day was historic for our industry. A series of events culminated in Joe Kane and Joel Silver creating the Imaging Science Foundation. The need for a reference setting that is close to the standard is now recognized throughout the industry. Today, most sets at least have a setting available that is supposed to be accurate.
Still, there is a need for ISF calibration because there is much more to setting a monitor than just color temperature. The reference black level, gray scale and white level are also important, along with a factor called gamma, which dictates the details of how gray scale (and thus all brightness levels) is presented. There are settings for detail and settings for color decoding. All have needed a trained technician to be set properly – that is, until SpyderTV came along.
Television and film work basically the same way. They use the additive color triangle. By using red, green and blue as the primary colors, any visible color can be created. Of course, the devil is in the details. In order to produce all of the colors, the primary colors must be as far toward the corners of the CIE chart as possible.
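To make the additive idea concrete, here is a minimal sketch (my own illustration, not anything from the SpyderTV software) of how the three primaries combine: each primary contributes intensity only to its own channel, and summing channels produces the secondary colors and, with all three primaries at full strength, white.

```python
def mix_additive(*colors):
    """Additively mix RGB colors: channel intensities sum, clipped to 1.0."""
    return tuple(min(1.0, sum(c[i] for c in colors)) for i in range(3))

# Full-intensity primaries as (R, G, B) triples.
RED, GREEN, BLUE = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)

print(mix_additive(RED, GREEN))        # red + green -> yellow (1.0, 1.0, 0.0)
print(mix_additive(RED, GREEN, BLUE))  # all three primaries -> white (1.0, 1.0, 1.0)
```

This is exactly why shifting a display’s primaries matters so much: every mixed color is built from them, so moving a primary moves the whole palette.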
What Is Color Temperature?
Say you were to take a steel bar (color scientists use what is called a theoretical black body) and put it in a fire. When it gets hot enough, it will glow. At first, it will be a faint deep red glow, but as it gets hotter, it will pass through all of the colors from red to blue. With this information, we can create a graph of which temperature creates which color. Using the absolute temperature scale (kelvin), we find that the point where equal amounts of red and blue are radiated is at 5,400 K. This is the reference color for white used in film, and all other colors are referenced from this point. On a triangular graph representing the three additive primary colors, red, green and blue, this point is dead center. For television, the NTSC reference color temperature for white is 6,500 K, which is slightly blue.

Keeping to the reference is simple in theory, but very hard to achieve in reality. For consumer CRTs and plasmas, there is a tradeoff between brightness and depth of color that makes a balanced choice necessary. There have been advances in phosphor technology, but achieving the reference primaries for HD is not easy and, because of the tradeoff with brightness, they are rarely used in consumer sets. The same is true of LCD monitors and digital projection systems such as DLP, LCD, LCoS and SXRD, where the bulb’s spectrum comes into play and the depth of the primary colors after filtering is altered. Very few consumer projectors use bulbs with a spectrum wide enough or smooth enough to achieve the reference primaries.
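The glowing-bar story can be sketched numerically with Planck’s law, which gives the spectral radiance of an ideal black body at a given temperature. The snippet below is my own illustration (the 650 nm “red” and 450 nm “blue” sample wavelengths are arbitrary choices, not from the article): it compares the energy radiated at a red versus a blue wavelength as the temperature climbs.

```python
import math

H = 6.626e-34   # Planck constant (J*s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck(wavelength_m, temp_k):
    """Spectral radiance of an ideal black body (Planck's law)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return a / b

def red_blue_ratio(temp_k, red_nm=650.0, blue_nm=450.0):
    """Ratio of radiance at a sample red wavelength vs. a sample blue one."""
    return planck(red_nm * 1e-9, temp_k) / planck(blue_nm * 1e-9, temp_k)

for t in (3000, 5400, 6500, 9300):
    print(f"{t} K: red/blue = {red_blue_ratio(t):.2f}")
```

With these sample wavelengths, the ratio is well above 1 at 3,000 K (a reddish glow), passes close to 1 near the 5,400 K film white point, and drops below 1 at the higher, bluer temperatures typical of uncalibrated sets.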
This presents a problem for SpyderTV, since the entire color palette is changed when you shift the primaries. If the set uses the wrong primaries, then every color is changed. If the color temperature is left well above 6,500 K, as it is on almost every consumer set, then everything has a blue cast, such as bluish sand and bluish faces. The average mass-market HDTV is calibrated to 8,500 K, with many ranging over 10,000 K at the moment you crack open the box.