Originally Posted by Ken S
Why, what's the point?
I am just speculating about a possible answer to your original question. BTW, how do you know the discs are mastered at 1080i instead of 1080p? If the original music videos were shot in 1080i video, then there would be no benefit to a 1080p recording. Planet Earth, on the other hand, was mastered with the same 1080p/VC-1 encode that Blu-ray uses, so you get the full benefit of the format.
As you know, most Hollywood movies are still shot on film (or HD video) at 24 fps. Few people realize that for these movies there is no difference between 1080i and 1080p. The 1080p output contains no additional information; it merely repeats some of the same frames (3:2 pulldown). The result is the same whether this conversion is done by the player or by your TV. See: http://www.digitalvideofestival.com/...&id=1182107778
There Is No Difference Between 1080p and 1080i
My bold-printed, big-lettered breaker above is a little sensationalistic, but, as far as movies are concerned, this is basically true. Here's why. Movies (and most TV shows) are shot at 24 frames per second (either on film or on 24-frame-per-second HD cameras). Nearly every TV sold in the United States has a refresh rate of 60 hertz, meaning the screen refreshes 60 times per second. In order to display 24-frame-per-second content on a display that essentially shows 60 frames per second, you need to create extra frames. This is accomplished by a method called 3:2 pulldown (or, more accurately, 2:3 pulldown). It doubles the first frame of film, triples the second frame, doubles the third frame, and so on, creating a 2-3-2-3-2-3 sequence. (Check out Figure 1 for a more colorful depiction.) So, the new frames don't contain new information; they are just duplicates of the original film frames. This process lets 24-frame-per-second film be displayed on a 60-Hz display.
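The 2-3-2-3 repetition described above can be sketched in a few lines of Python. This is just an illustration of the frame-counting arithmetic, not any real player's implementation; the frame labels are made up:

```python
# Sketch of 2:3 pulldown: 24 film frames become 60 video frames per
# second by repeating frames in an alternating 2-3 pattern.
# Frame labels ("F0", "F1", ...) are illustrative only.

from itertools import cycle

def pulldown_2_3(film_frames):
    """Repeat each film frame 2 or 3 times, alternating, so that
    24 input frames yield 60 output frames (12*2 + 12*3 = 60)."""
    out = []
    for frame, repeats in zip(film_frames, cycle([2, 3])):
        out.extend([frame] * repeats)
    return out

one_second_of_film = [f"F{i}" for i in range(24)]  # 24 film frames
video = pulldown_2_3(one_second_of_film)
print(len(video))   # 60 video frames, yet no new picture information
print(video[:5])    # ['F0', 'F0', 'F1', 'F1', 'F1']
```

Note that every one of the 60 output frames is a verbatim copy of one of the 24 originals, which is the whole point of the article's argument.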
It's Deinterlacing, Not Scaling
HD DVD and Blu-ray content is 1080p/24. If your player outputs a 60-Hz signal (that is, one that your TV can display), the player is adding (creating) the 3:2 sequence. So, whether you output 1080i or 1080p, it is still inherently the same information. The only difference is whether the player interlaces it and your TV deinterlaces it, or the player just sends out the 1080p signal directly. If the TV correctly deinterlaces 1080i, then there should be no visible difference between deinterlaced 1080i and direct 1080p (even with that extra step). There is no new information, nor is there more resolution, as some people think. This is because, as you can see in Figure 1, there is no new information with the progressive signal. It's all based on the same original 24 frames per second.
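The "interlace, then deinterlace" round trip can be demonstrated with a toy model. This is a simplified sketch (a frame is just a list of scan lines; real deinterlacers work on actual video and must also detect the 2:3 cadence), but it shows why weaving two fields from the same film frame loses nothing:

```python
# Toy model: split a progressive frame into two 1080i-style fields,
# then "weave" them back together. For film-sourced content, both
# fields come from the same original frame, so the weave reconstructs
# it exactly -- 1080i carries the same information as 1080p.

def interlace(frame):
    """Split a progressive frame (a list of scan lines) into
    a top field (even lines) and a bottom field (odd lines)."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Recombine two fields into one progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

progressive = [f"line{n}" for n in range(1080)]
top, bottom = interlace(progressive)
print(len(top), len(bottom))              # 540 540
print(weave(top, bottom) == progressive)  # True: lossless round trip
```

The catch, as the article says, is that the TV has to deinterlace *correctly*; a deinterlacer that misjudges the cadence and weaves fields from two different film frames is where visible 1080i artifacts actually come from.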