Hi, I'm wondering how ONLY the frame rate affects the look of a video. You can really tell the difference in motion blur between 24 fps footage shot at 1/48 and, let's say, 1/500; so it makes sense that a 24 fps video has more motion blur than a 30 fps one, because of the 180° rule and the resulting increase in shutter speed. But if you take shutter speed out of the equation and vary just the frame rate, how does that affect the “motion look” of your footage? I ran some tests shooting 24 fps at 1/60 and 60 fps at 1/60, and I couldn’t see any difference (should I?). So: why does 24 fps, which we all got used to from Hollywood films, look more “filmic” than 30 fps? Is it only the shutter angle, or the actual frame rate too?
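To make the shutter-angle arithmetic concrete, here is a small Python sketch (the function names are just for illustration, not from any camera API). Exposure time per frame is (shutter_angle / 360) / frame_rate, so a 180° shutter always gives an exposure of half the frame interval:

```python
def exposure_time(frame_rate: float, shutter_angle: float = 180.0) -> float:
    """Exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_angle / 360.0) / frame_rate

def shutter_angle(frame_rate: float, exposure: float) -> float:
    """Equivalent shutter angle in degrees for a given frame rate and exposure."""
    return 360.0 * frame_rate * exposure

# 180-degree rule: 24 fps -> 1/48 s, 30 fps -> 1/60 s (less blur per frame)
print(1 / exposure_time(24))        # 48.0
print(1 / exposure_time(30))        # 60.0

# The test described above: both clips exposed at 1/60 s per frame,
# so the per-frame motion blur is identical -- only the angle differs.
print(shutter_angle(24, 1 / 60))    # 144.0 degrees
print(shutter_angle(60, 1 / 60))    # 360.0 degrees
```

This matches the observation in the test: with the exposure fixed at 1/60, each individual frame carries the same amount of blur at 24 fps and 60 fps; what changes between the two clips is only how many of those frames are shown per second.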
You should have seen a VERY noticeable difference between 59.94 fps and 23.976 fps footage. But since we don't know what you used to play back the two clips -- hardware AND software -- it's impossible to make informed comments.
KGAN (CBS) & KFXA (Fox) Cedar Rapids, IA