You are caught in the "must have the best" paradigm, which is stupid. 1080p and 1080i are the same, and they are probably both better than your eye can resolve. Once you have reached the limit of the human body, there is no reason to go farther.
This is a falsehood. If your friend stands next to your 1080i TV, can you see more detail in his/her face than in the face of the newscaster shot in HD on the screen? Do the strands of hair look more "real" on your friend than they do in the TV screen image?
Your eye can see much more detail than 1080i/p. Thinking otherwise is as silly as suggesting the human eye can't see the difference between a 300 dpi and a 1200 dpi printer. We shouldn't have to wrestle with psychological issues like "the upgrade bug" merely to have an accurate discussion about what the human eye can plainly resolve for anyone with normal vision.
Having said that, one does reach a point of diminishing returns given the inherent resolution of the source material being delivered (like a 1980s flat-shot film on grainy film stock) *and* the limits of the display chain in a consumer's living room (i.e., sitting 3 screen widths away from an HDTV won't reveal all the detail that's actually in the 1080 picture), but those are separate issues from the primary question.
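You can put rough numbers on the viewing-distance point. A commonly cited figure for normal (20/20) vision is that the eye resolves about 1 arcminute of detail; here's a back-of-the-envelope sketch (the 1-arcminute figure and the function name are my own assumptions, not anything from the thread) showing that at 3 screen widths, one pixel of a 1920-wide picture subtends well under that:

```python
import math

ACUITY_ARCMIN = 1.0  # rough limit for 20/20 vision, ~1 arcminute

def pixel_angle_arcmin(screen_width, distance, h_pixels=1920):
    """Angular size of one pixel in arcminutes, for a screen of the
    given width viewed from the given distance (same units for both)."""
    pixel = screen_width / h_pixels
    return math.degrees(math.atan2(pixel, distance)) * 60

# Sitting 3 screen widths back from a 1920-pixel-wide display:
angle = pixel_angle_arcmin(screen_width=1.0, distance=3.0)
print(f"{angle:.2f} arcmin per pixel")  # ~0.60 arcmin, below the ~1 arcmin limit
```

At that distance the pixel pitch is finer than the eye's resolving power, which is exactly why sitting 3 screen widths away "won't reveal all the detail that's actually in the 1080 picture" — you'd need to sit closer (or have a bigger screen) before the eye, not the display, becomes the bottleneck.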
Also, there is an enormous difference between 1080i (live-shot, interlaced) and 1080p (live-shot, full frames). Just as there's a difference between 480i NTSC video and 480p progressive-scan video. Do a bit of research into interlacing and deinterlacing algorithms if you're curious.
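To see why interlaced isn't just "the same picture in two halves": each interlaced field captures only every other scan line, and the two fields are shot at different instants, so a deinterlacer has to reconstruct the missing lines somehow. A minimal sketch of the simplest strategy, usually called "bob" line-doubling (function name and list-of-scan-lines representation are mine for illustration):

```python
def bob_deinterlace(frame):
    """Bob deinterlacing: keep one field (the even-indexed scan lines)
    and fill each missing odd line by repeating the line above it.
    This avoids combing artifacts on motion, but throws away half the
    vertical resolution of the frame."""
    out = []
    for i in range(0, len(frame), 2):
        out.append(frame[i])  # keep the even-field line
        out.append(frame[i])  # line-double to stand in for the odd-field line
    return out

# A 4-line frame where odd lines came from a later instant:
print(bob_deinterlace([[10], [99], [30], [99]]))  # → [[10], [10], [30], [30]]
```

Real deinterlacers (weave, motion-adaptive, motion-compensated) do much better than this, but none of them can conjure back detail that was never captured in a single instant — which is the practical gap between 1080i and 1080p for live-shot material.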