So this is one of those questions that goes back and forth on the boards, and I just need some help understanding this concept.
I understand that drop frame timecode doesn't actually drop any frames. Rather, because certain frame rates run slightly slower than their nominal rate, a straight frame count drifts away from a wall clock, so to compensate, the first two frame numbers of every minute, excluding every tenth minute, are skipped in the count.
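Just to check myself, here's a little Python sketch of how I understand that counting rule to work (my own code, not from any spec, so treat it as my assumptions in executable form):

```python
def frames_to_df_timecode(frame_count):
    """Turn a raw 29.97 fps frame count into a drop-frame timecode label.

    No frames are dropped; the labels ;00 and ;01 are skipped at the
    start of every minute except minutes divisible by ten.
    """
    drop = 2                            # labels skipped per drop minute
    per_min = 60 * 30 - drop            # 1798 labels in a drop minute
    per_10min = 9 * per_min + 60 * 30   # 17982 labels per 10 minutes

    d, m = divmod(frame_count, per_10min)
    if m > drop:
        # add back the skipped labels: 18 per completed 10-minute block,
        # plus 2 for each completed drop minute inside the current block
        frame_count += 9 * drop * d + drop * ((m - drop) // per_min)
    else:
        frame_count += 9 * drop * d

    ff = frame_count % 30
    ss = frame_count // 30 % 60
    mm = frame_count // 1800 % 60
    hh = frame_count // 108000
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(frames_to_df_timecode(1799))  # 00:00:59;29
print(frames_to_df_timecode(1800))  # 00:01:00;02 (;00 and ;01 skipped)
```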
This makes perfect sense to me for 29.97 video. Ok, great.
Now, on the other hand, we have the fun of "24p" video, which, as I understand it, actually runs at 23.976 fps (usually rounded to 23.98). What's fuzzy to me is why 23.98 video is then counted using non-drop-frame timecode. (I know this because Final Cut won't even let you select drop-frame with 23.98 formats, which corroborates what other people have said in other posts about video cameras using non-drop frame for their 24p modes.)
And while we're on the subject, is 23.98 fps not a round number for the same reason that 29.97 fps isn't?
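My hunch is yes: both look like the same NTSC 1000/1001 slowdown applied to a round number. Here's my arithmetic, so somebody please correct me if this is off:

```python
from fractions import Fraction

ntsc = Fraction(1000, 1001)   # NTSC color-era slowdown factor
print(float(30 * ntsc))       # 29.97002997...  -> rounded to "29.97"
print(float(24 * ntsc))       # 23.976023976... -> "23.976" / "23.98"
```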
And if so, how can non-drop-frame timecode actually represent time accurately?
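Here's the back-of-the-envelope drift I keep arriving at (again, my own arithmetic and assumptions):

```python
from fractions import Fraction

ntsc = Fraction(1000, 1001)

for nominal in (30, 24):
    true_rate = nominal * ntsc   # 29.97002... and 23.97602... fps
    shot = true_rate * 3600      # frames actually captured in one wall-clock hour
    counted = nominal * 3600     # labels an NDF counter hands out per timecode hour
    lag = (counted - shot) / nominal  # seconds the timecode lags the wall clock
    print(nominal, float(lag))   # ~3.5964 s/hour in both cases

# 29.97 drop-frame cancels its ~107.9-label/hour gap by skipping
# 2 labels x (60 - 6) minutes = 108 labels per hour. At 23.976 the gap
# is ~86.3 labels/hour (about 1.44 per minute), with no tidy
# whole-number pattern; I suspect that's why nobody defined
# drop-frame for it.
```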
Hope I've been articulate enough in asking; I know a lot of people who ask questions on this topic get flamed for it... but I'm just trying to wrap my mind around this thing.