If the subject of this post doesn’t make any sense, you probably want to skip this. I could have Googled the topic, but I think that might have stopped me from writing about it – and I’ll take any chance to be more prolific!
Recently, a friend of mine gifted me a DVD (to mark another year closer to death, if you want to put it like that) and I was amazed by the visual quality. Not because it was Blu-ray; it wasn’t. There is a Blu-ray version, and the film was likely originally recorded in high definition, with the standard DVD being a downscale of that.
I haven’t invested in a Blu-ray player as I don’t really watch much. I have a ten-year-old DVD player that still does the job and will keep doing it until it blows. But I was taken aback: compared with older DVDs from before high-def was around, there wasn’t much graininess (viewed on a flat screen); instead there was a kind of pseudo-HD clarity.
I took the disc round to my friend (the one who gifted it to me, and who owns a laptop with a Blu-ray drive that he hooks up to his TV over HDMI) and I didn’t tell him what I thought of the visual clarity. But after a few minutes he said, “That’s nearly high-def!”
My guess (and surely some article confirms this; I may read up on it later) is that HD content downscaled to SD starts with so much detail that it loses relatively little in the down-conversion, so it still looks good. Content originally recorded in SD, by contrast, starts from a lower-quality source, and won’t look good even on upscaling players.
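To make my hunch a bit more concrete, here’s a toy numerical sketch. It’s purely my own assumption about why downscaling helps (averaging many samples smooths out capture noise), not anything to do with how real DVDs are actually mastered:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (my assumption): the "scene" is a smooth signal, and every
# capture adds some sensor noise on top of it.
n_hd, n_sd = 1080, 540
x_hd = np.linspace(0, 1, n_hd)
scene_hd = np.sin(2 * np.pi * 3 * x_hd)

# An "HD master": the scene sampled at 1080 points, plus noise.
hd_capture = scene_hd + rng.normal(0, 0.2, n_hd)

# Downscale HD -> SD by averaging adjacent pairs of samples; the
# averaging also halves the noise variance.
sd_from_hd = hd_capture.reshape(n_sd, 2).mean(axis=1)

# A "native SD" capture: the same scene sampled at only 540 points,
# carrying the full noise.
x_sd = x_hd.reshape(n_sd, 2).mean(axis=1)
scene_sd = np.sin(2 * np.pi * 3 * x_sd)
native_sd = scene_sd + rng.normal(0, 0.2, n_sd)

# Compare each SD frame against the clean scene at SD resolution.
err_downscaled = np.mean((sd_from_hd - scene_sd) ** 2)
err_native = np.mean((native_sd - scene_sd) ** 2)
print(err_downscaled, err_native)  # the HD-derived frame has lower error
```

In this little model the HD-derived SD signal ends up with roughly half the noise of the natively captured one, which at least matches what my eyes were telling me.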
SD looks good on old CRT (tube) TVs because of the kind of softening and blending those sets naturally do. Fewer people own these now, so I wonder, while HD adoption is still under way, whether “sub-HD” is worth exploiting as a label. Couldn’t DVDs converted from HD to SD carry that tag? Couldn’t HD-to-SD broadcast content be advertised the same way? I suppose the problem is that if you can deliver a halfway solution at a lower price, you put people off going full HD.
I’m pretty sure my eyes didn’t deceive me, anyway. To me, HD downscaled to SD certainly looks better than SD that was shot as SD.