
Unravelling a web of interlacing (a question of interpretation)

Daniel James
Unravelling a web of interlacing (a question of interpretation)
on Aug 9, 2018 at 11:34:10 am

Hello! I am trying to wrap my head around a rather convoluted problem of my own making. I am attempting to "remaster" an old project that I did a few years ago, in an effort to slightly improve its image quality. There were no obvious problems with the project at the time and I was happy with it, but I think it can be improved with a few tweaks to the settings. I still have the project files exactly as I left them. Forgive the lengthy backstory, but it's necessary context. Here are the details:

Broadly speaking, the project is a VFX-heavy film, completed in standard definition only. It was intended for online streaming and DVD (it was never intended to be HD and I have no intention of upscaling it or anything like that), and was mastered in a progressive format from within a progressive sequence.

None of the original footage was interlaced (I'll get to that in a moment), it was all shot at 25fps progressive, and the project was finished at 25fps progressive (720x576 anamorphic widescreen).

The film contains a mixture of live action footage (shot as described), augmented live action footage (where short clips were exported losslessly to After Effects, manipulated in some way, and exported back into Premiere as new clips), and entirely fabricated sequences (created in After Effects). The augmented and fabricated sequences were exported at the project's native resolution, 720x576 @ 25p, although here and there I did a bit of manipulation within the Premiere sequence itself, such as speeding up some clips or doing slight digital zooms.

Here's where it gets a bit complicated:

The live action footage was *originally* captured on a DSLR camera at 1920x1080 at 25fps (not 50i, specifically 25 full frames per second). Since I had no intention of doing an HD project back then, I decided to convert all of the footage first (hundreds of .MOV files from the camera) into a single .AVI file, in DV PAL format. This, I thought, would be perfectly adequate for a standard definition project, as well as being faster to edit. I tested the conversion on some of the footage and was happy with how it looked before converting the rest in its entirety... and then I deleted all the original .MOV files to clear some much-needed space on my hard drive. Yeah, kinda regretting that now, but moving on.

The trouble is, this conversion was done so long ago that I don't exactly remember *how* I did it, and I'm pretty sure I didn't know what I was doing at the time either. What I think happened was this:

All of the 1080p .MOV clips were dragged into a blank Premiere Pro project. I created a new sequence using the PAL Widescreen setting (which, by default, would have been interlaced, Lower Field First, I believe?) and dropped all of those .MOV clips into the timeline. I then RE-SIZED each of those .MOV clips within the sequence, to... approximately 53.4% or thereabouts - until they filled the frame of the 576 sequence. Then I exported the entire sequence as a PAL DV 720x576 AVI file. Again, by default, Premiere sets exports to Lower Field First, so that is almost certainly how it was exported. That .AVI file is what I'm left with and what I'm trying to interpret and improve (or at least wrap my head around).
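(Sanity-checking that figure now: scaling by frame height alone gives 576/1080 ≈ 53.3%, so the ~53.4% I remember is consistent with filling the frame vertically. A quick back-of-the-envelope in Python — purely illustrative, since the exact percentage depends on how Premiere compensates for the anamorphic pixel aspect ratio:)

```python
# Back-of-the-envelope check on the ~53.4% scale figure. Purely illustrative:
# the exact percentage depends on how Premiere handles the anamorphic
# pixel aspect ratio of the PAL widescreen sequence.

src_height = 1080  # 1920x1080 square-pixel DSLR footage
dst_height = 576   # 720x576 PAL DV frame

scale = dst_height / src_height
print(f"Scale needed to fill the frame height: {scale:.1%}")  # ~53.3%
```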

Now, I have been reading up on the differences between progressive and interlaced, and although I understand the hows and whys in a normal situation, the actual mechanics are a bit of a mindbender. I can't determine whether this .AVI file that I converted years ago (from a collection of 25fps clips) contains interlaced or progressive "footage", how such a difference would manifest, or whether it's even important. I can use a program like MediaInfo, but that is all based on how the file is "flagged", I believe. Unless I've misunderstood how images are stored within a "video file" (i.e. a sequence of images), there is no structural difference between a file that is progressive and one that is interlaced. Any difference occurs at the point of INTERPRETATION, which is decided by the operator on the basis of the SOURCE of the footage (i.e. how it was captured).
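(As an aside, the flag itself can be read programmatically — a minimal sketch using the pymediainfo wrapper, assuming that package is installed; the filename is hypothetical. Note it only reports how the file is flagged, not what the frames actually contain:)

```python
# Read the scan-type flag from the file's metadata. This reports how the
# file is FLAGGED, not whether its frames genuinely contain two temporally
# distinct fields. Assumes the pymediainfo package (a MediaInfo wrapper)
# is installed; the filename is hypothetical.
from pymediainfo import MediaInfo

media_info = MediaInfo.parse("converted_footage.avi")
for track in media_info.tracks:
    if track.track_type == "Video":
        data = track.to_data()
        print("Scan type: ", data.get("scan_type"))   # e.g. "Interlaced"
        print("Scan order:", data.get("scan_order"))  # e.g. "Bottom Field First"
```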

In other words, if you shoot something interlaced at 50i, you will still have a video file that is 25fps, but each frame of that file will contain temporally misaligned "lines", hence "combing" artifacts when viewed without field separation. Telling the video editor "this footage is interlaced, it needs to be separated" will take those odd and even lines and split them into new frames, making a sequence with double the framerate where each frame has only half the vertical resolution. The remaining resolution (the missing lines in each frame) would be interpolated, filled in by the computer so that you can view it on your monitor. Right? That's why, when you have footage from an interlaced SOURCE, you should field-separate it; otherwise you get combing artifacts.
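(To make that concrete, here's a rough sketch of the separation in Python/numpy — purely an illustration of the mechanics as I understand them, not what Premiere literally does internally; the lower-field-first order and the crude line-doubling interpolation are my assumptions:)

```python
import numpy as np

def separate_fields(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split one interlaced frame (height x width) into two half-height fields."""
    lower = frame[1::2]  # odd-numbered lines (the "lower" field)
    upper = frame[0::2]  # even-numbered lines (the "upper" field)
    return lower, upper  # lower field first, as in DV PAL

def interpolate_field(field: np.ndarray) -> np.ndarray:
    """Stretch a half-height field back to full height (crude line doubling)."""
    return np.repeat(field, 2, axis=0)

# One 576-line frame becomes two 288-line fields; each field is then
# interpolated back up to 576 lines for display, so 25fps becomes 50fps
# with half the vertical detail per displayed frame.
frame = np.random.rand(576, 720)
f1, f2 = separate_fields(frame)
print(f1.shape, interpolate_field(f1).shape)  # (288, 720) (576, 720)
```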

So, if you had a progressively shot sequence, the computer still sees a 25fps video file, but all the odd and even lines are from the same moment in time, so they don't exhibit combing. As far as the computer is concerned, there's no difference in the type of data. If you told the editor to interpret this file as interlaced, it would take those odd and even lines, split them into new frames, and interpolate the missing lines in each frame, reducing quality while needlessly doubling the framerate (needless in this case because the fields represent the same moment in time; there's no additional temporal information in them). Moreover, if you're working in a 25fps environment anyway, those duplicate frames are discarded.

So, with this in mind, it would seem to me that if you have a progressively shot sequence but you interpret as interlaced, you are throwing away half of your vertical resolution for no reason. But that resolution is still there in the file. Which means that, if I did indeed export my raw footage as an interlaced video years ago, all I need to do to retrieve those "lost" lines is to tell Premiere that, actually, this isn't interlaced; this is progressive. I'm telling it "do not separate fields, treat each frame as a frame". Right? Normally, this would result in "combing" but in this case the file shouldn't contain any combing because the fields were taken from the same frame.
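(To convince myself that the lines really are still recoverable, a toy demonstration: weaving the two fields of a progressively-shot frame back together reproduces the original bit-for-bit. Again, just an illustration of the idea, not Premiere's actual code:)

```python
import numpy as np

def weave(lower: np.ndarray, upper: np.ndarray) -> np.ndarray:
    """Re-interleave two half-height fields into one full progressive frame."""
    full = np.empty((lower.shape[0] * 2, lower.shape[1]), dtype=lower.dtype)
    full[0::2] = upper  # even-numbered lines from the upper field
    full[1::2] = lower  # odd-numbered lines from the lower field
    return full

# Fields cut from a *progressive* frame describe the same moment in time,
# so weaving them back together loses nothing:
original = np.random.rand(576, 720)
lower, upper = original[1::2], original[0::2]
assert np.array_equal(weave(lower, upper), original)  # bit-for-bit identical
```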

Logically, then, if I treat everything as progressive from now on, I should have the full (SD) resolution intact. I should export clips as progressive, interpret them in AE as progressive, and export them back into PP as progressive. There should be no problem... right?

But there does appear to be a problem, and I'm not sure why. In Premiere Pro, when the .AVI files are interpreted as progressive, there is a distinct loss of quality, with 'stepping' artifacts visible on curved or diagonal edges. Conversely, when I interpret the .AVI as interlaced (LFF), the quality seems to improve! Edges become more defined, the stepping problem goes away and the picture takes on a sharper quality (I mean, as sharp as you can expect with SD footage, but you know). When scrubbing through the timeline, the quality shifts back down until, a second or so after stopping, it snaps back to the better quality again. What is happening here?

Bear in mind, I'm working within a progressive sequence.

I'm trying to understand what is happening, whether it is what appears to be happening, and why it's happening. Normally, I would say "do whatever looks best" and I probably will, but I would really like to know WHY, and what the best practice should be going forward.

Help me understand, and be as geekily technical as you like. ☺

How is the interpretation of the video file changing the apparent quality in Premiere's preview window? How will this affect quality when exporting from a progressive sequence? Is there actually any difference between 25fps footage that is exported as interlaced and the same footage exported as progressive, or is it entirely determined by how I interpret that exported file? What is Premiere actually "doing" when it puts an interlaced video into a progressive sequence, and is what I'm seeing really what I'm getting?

I appreciate this is an oddly specific scenario and that the normal "best practices" for dealing with interlaced video probably don't apply to it, which is why I haven't been able to find an answer. In fact, the answers I've found seem to be the reverse of what I've observed, so I'm totally confused! I'm also relying on my failing memory and a few assumptions.

Phew! Any takers? :)



Ann Bens
Re: Unravelling a web of interlacing (a question of interpretation)
on Aug 9, 2018 at 5:37:05 pm

Can you sum up your problem in a few sentences...

-----------------------------------------------
Adobe Certified Expert Premiere Pro CS2/CS6/CC
Adobe Community Professional



Daniel James
Re: Unravelling a web of interlacing (a question of interpretation)
on Aug 10, 2018 at 8:50:13 am

Ha, sorry! I'll try. 😅

Basically, I have a 25fps AVI file that was created (converted) some time ago from a series of progressive 25fps MOV files, but it was most likely exported as interlaced at the time, from an interlaced sequence in Premiere Pro.

I'm trying to determine whether the full frame data from the original MOV files still "exists" within this AVI file and, if so, what the best way is to interpret it when bringing it back into PP.


---------


I've found somebody on the Adobe forums with basically the same question:

https://forums.adobe.com/message/3318353

Theirs was about a telecined film, but it's the same thing. They determined, like I have, that since each field of the video is taken from the same frame of the film, the video is technically progressive even if it wasn't exported that way.

Which is why I was very confused to see Premiere Pro showing me a worse-quality image* when interpreting this file as progressive.

HOWEVER, having now checked the same AVI file in After Effects, I can see the full-frame quality when interpreted as progressive, and softer quality when interpreted as interlaced, exactly as I would expect.

So it seems the problem is specific to the way PPro is DISPLAYING the picture in its monitor, not how the final output will look. In fact, when exporting some of that footage from PPro into AE, the picture quality looks identical to the original.

PROBLEM SOLVED, then. But I still don't know WHY PPro is displaying such poor quality in its monitor if I have everything set to progressive. There must be some trait of the video, its metadata, or just how its fields are stored, that is throwing PP off?

(*Unusually chunky pixels and poor definition; I'll share some screenshots when I can.)




Ann Bens
Re: Unravelling a web of interlacing (a question of interpretation)
on Aug 10, 2018 at 9:04:00 pm

Drag the clip onto the New Item icon and it will give you a matching sequence.
You can also check the file with MediaInfo.

Whether it was an interlaced timeline or not, you will never know.

-----------------------------------------------
Adobe Certified Expert Premiere Pro CS2/CS6/CC
Adobe Community Professional



Daniel James
Re: Unravelling a web of interlacing (a question of interpretation)
on Aug 11, 2018 at 3:24:43 pm

Both MediaInfo and PPro's built-in detection think it is an interlaced video, no doubt because it was exported that way.

I've determined that both fields can be combined into whole frames, so the only issue now is why PPro is messing up the image in its monitor.

But it's not particularly important. Thanks anyway. ☺


