(I put this question up on the editing forum as well, but figured cinematographers might also be able to give me some good info. . .)
Are there any practical reasons for recording (or editing) interlaced video these days, particularly since just about every (consumer) display uses progressive scanning? Or is it just something of a relic left over from the old CRT days. . .a 'transitional phase'?
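(For anyone fuzzy on the terms: a progressive frame stores every scan line from one instant, while an interlaced frame is really two "fields" captured at different instants--even-numbered lines in one, odd-numbered lines in the other. A toy sketch of the idea, with rows as labeled strings rather than real pixel data:)

```python
def split_into_fields(frame):
    """Split a progressive frame (a list of rows) into its two fields."""
    top_field = frame[0::2]     # even-numbered rows, captured first
    bottom_field = frame[1::2]  # odd-numbered rows, captured 1/60 s later
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """Naive 'weave' deinterlace: re-interleave the two fields into one frame.

    This only looks right when nothing moved between the two field captures;
    motion between fields is what produces the classic combing artifacts.
    """
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

frame = [f"row {i}" for i in range(6)]
top, bottom = split_into_fields(frame)
restored = weave(top, bottom)  # identical to the original frame
```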
It depends on the look you are going after. Progressive has a very different look than interlaced footage. Depending on what you want and the look/feel/mood of the piece, you'll choose one over the other.
Typically, the world seems to be moving to 24p, as it is much more filmic, and the motion effects you get at 24 fps seem to add drama. Interlaced footage such as good ol' NTSC (or 1080i) has much more of a "live," instant look that puts the viewer somewhat in the scene, whereas progressive tends to remove the viewer from the scene.
I forget which, but one of the national soap operas a while back (oops, I mean "daytime dramas") switched over and started production in 24p. They thought it looked much prettier and more filmic, which I'm sure it did. Viewers were outraged and there was a huge backlash: they now felt they were watching a movie, not truly "in" the live, instant-looking, "video-y" show they had been used to. The production was switched back to interlaced.
It just depends on the look you want.
Fantastic Plastic Entertainment, Inc. fantasticplastic.com
I think I know what you're talking about, somewhat, at least when it comes to framerates. . .
Personally, I find that 'film look' people keep talking about--achieved somewhere between 24 fps and the types of lenses used, i.e. those with a shallow depth of field--distracting when used in video. There's a certain conspicuousness to it, as if the videographer/producer, racking focus from one subject to another, were trying to advertise how filmic everything looks.
But worse than that, and on the flip side, is the way a lot of the new flatscreens have this 240-Hz function (or whatever it's called) that really cheapens the appearance of whatever movie they're playing. Retailers (who will remain anonymous) love to show it off on their home theater displays, but to me it looked like crap before I even knew anything different was going on. It looked like cheap video shot in the '80s. . .like a soap opera.
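(That "240-Hz function" is usually marketed as motion smoothing or motion interpolation: the TV synthesizes in-between frames so a 24 fps film plays back at the panel's refresh rate. Real sets use motion-compensated interpolation; the crudest version of the idea is just a cross-fade between neighboring frames, roughly like this toy sketch:)

```python
def blend_frames(frame_a, frame_b, t):
    """Naive in-between frame: cross-fade pixel values at fraction t (0..1).

    This is the simplest stand-in for what a TV's interpolation does;
    it is what gives interpolated film its smooth, 'video-y' motion.
    """
    return [a * (1.0 - t) + b * t for a, b in zip(frame_a, frame_b)]

# Toy 3-pixel "frames" from a 24 fps source.
frame_a = [0.0, 0.2, 0.4]
frame_b = [1.0, 0.2, 0.0]

# Synthesize 9 in-between frames per source pair, so each 24 fps frame
# pair yields 10 display frames -- roughly a 240 Hz output cadence.
in_betweens = [blend_frames(frame_a, frame_b, i / 10) for i in range(1, 10)]
```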
I believe the same happened to the MTV awards one year, being shot in 24p.. loads of complaints.. that it didn't look live..?
Does seem to be some psychological conditioning.. or just what people have gotten used to: movies/drama and sports events looking different..?
Do you know what year the awards were? I'm asking because the 2009 awards--where K. West became infamous--were what made me appreciate HD. I mean. . .watching Lady Gaga's performance of "Paparazzi". . .it all made sense! (Yeah, I know. Cheesy. Oh well.)