Boris CC 3.0.3 Optical Flow warping/rippling issue
Using the Boris CC 3.0.3 Optical Flow filter in FCP 5.0.3, I've been getting a warping/rippling artifact along certain moving edges. Despite some experimentation, I have no idea what parameter or combination of parameters "cures" this.
Obviously the specific settings depend on the content, but which setting(s) should I adjust, and in which direction, to deal with this? I've looked at the training disc by Steve Bayes as well as the manual, but I can't seem to improve things much.
I don't like FCP's slow-mos, but if it takes hours of trial and error to do better, I just don't think my clients will pay for it.
Colliding patterns? The rippling you're speaking of: does it include two patterns (i.e. vertical lines and angular lines)? Is it specific to the footage, or does it do the 'wave' all the time when you use the Optical Flow filter?
In one instance it's a hand moving rapidly across a black jacket. The jacket "displaces" as if it were liquid. In another case there's the same rippling in the background around a person's head as she bends down. A golf cart drives behind someone, and the skirt she wears ripples as the golf cart crosses.
Basically, this rippling occurs when a moving object passes either another still or moving object. I believe the Min Edge Contrast, Low Velocity Correction, and Velocity Limit settings might improve this, but the descriptions in the manual and on the Bayes Tutorial DVD are a bit "abstract." Phrases like "magnitude of motion estimation" and "appears incorrect" aren't too descriptive.
These two are real bafflers.
"If your effect includes Optical Flow errors that are localized to small areas where the image is moves together, increase the Min. Edge Contrast to 200 or 400. If errors occur at motion boundaries, decrease Min. Edge Contrast to approximately 50."
What does "is moves together" mean? Decrease it "if errors occur at motion boundaries," but increase it when errors are "localized to small areas where the image is moves together"?
"If the estimated motion is larger than the actual motion, the image will distort and the motion vector display will show long motion vectors."
I almost understand this, but I thought the vectors get longer as the motion between frames gets larger. I'm not sure how this helps show that the filter is overestimating the motion change.
The above is from page 559 of the BCC3 manual. It would be helpful to have a tutorial with before-and-after pictures showing each settings change and its impact.
Don't feel bad. Avid's own Timewarp effect produces similar results. I simply don't have the time to fiddle around and achieve a usable effect, so it isn't a part of my "palette" right now.
IIRC Boris Red used to have a demo using a guy on a surfboard to illustrate how smoothly water spray passes over the subject, but I never saw what the filter settings were.
I have seen your footage, and this is a pretty good example of "occlusion". As with all optical flow algorithms, the smoothness of the motion effect is created by predicting where pixels will be in the next field. When you have objects that come in from out of frame or cross over other objects, they are momentarily occluded. The ability to predict where a pixel will be breaks down, since the optical flow algorithm assumes a pixel will be in one place and then it is blocked by another object or the edge of frame. Optical flow doesn't "see" objects, just individual pixels that appear to be travelling in a particular direction at a particular speed. So yes, it is a bit abstract to explain.
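To make the occlusion idea concrete, here is a minimal Python sketch of forward warping a frame along per-pixel motion vectors. This is purely an illustration of the general technique, not Boris CC's actual algorithm; `forward_warp` and its hole-tracking are invented for this example. When two pixels' vectors collide on the same target, one overwrites the other, and the vacated locations become "holes" the filter has to fill somehow, which is where the rippling artifacts tend to come from.

```python
# Hypothetical illustration of per-pixel forward warping (not BCC's code).

def forward_warp(frame, flow):
    """Move each pixel along its (dy, dx) motion vector.

    Returns the warped frame plus a list of 'holes': targets where no
    source pixel landed (i.e. the region was occluded or vacated).
    """
    h, w = len(frame), len(frame[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dy, dx = flow[y][x]
            ty, tx = y + dy, x + dx
            if 0 <= ty < h and 0 <= tx < w:
                # Later writes overwrite earlier ones: two objects
                # "colliding" on one pixel is exactly the occlusion case.
                out[ty][tx] = frame[y][x]
    holes = [(y, x) for y in range(h) for x in range(w) if out[y][x] is None]
    return out, holes

# Two pixels whose vectors collide on the same target leave a hole behind:
warped, holes = forward_warp([[1, 2]], [[(0, 1), (0, 0)]])
# warped is [[None, 2]] and holes is [(0, 0)]
```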
The short answer is that some shots will work perfectly with any optical flow effect, others can be improved with the controls shown on the DVD, and some will be better served by another form of motion effect. As a very last resort, you might try masking so that part of the image uses optical flow and another part uses a different type of motion effect.
PS Except for the moving cart and the funky hand you mentioned, I thought the effect looked pretty good, and I was the designer of the Avid optical flow.
Media 100/Boris FX
Thanks for your response. I understand the occlusion issue. As a sometime compressionist (I believe I spoke to you at NAB Post NYC about WMV output on Media100 using Flip4Mac at the very end of the show - they sent Boris the 1.0.6 update), I was thinking in terms of IBP-type GOPs, in which an algorithm can be told where to look for an I-frame. In this case the B/P-type frames would be the newly created frames. This could be helped by a 2-pass encode, couldn't it?
I'm confused about some of the settings. For example, I think of Bi Directional Mix and Nearest Mix as "look both ahead and behind" vs. "look at the nearest (either/or) I-frame from a current B/P-type frame" in predicting motion. The preconceived notions I had might be leading me to misinterpret these functions in Optical Flow.
I also thought of Low Velocity Correction ("If the Optical Flow moves an area that should not move") and Velocity Limit ("percentage of the size and resolution of the Source Layer") as sort of "sensitivity" adjustments governing where it looks ahead/behind in the nearest frame (sort of like the region used in image stabilization). My thinking was that a faster-moving object might need to look at a wider region in the frame to find where it came from/should go to.
I'm also confused about Min. Edge Contrast, which "sets the threshold for the minimum amount of detail in a region." I'm just not sure how edge detail, or the contrast on those edges, is affected by Optical Flow, given that one is trying to avoid the softening of Frame Blending.
I thought displaying the Vectors would help in my tweaking, but I find the vectors aren't visible in Draft Only. They are when set to Render, which I can then turn off before the render. The problem is that when doing Option-P (FCP - Play All Frames) and trying to park on a frame at issue, it jumps to another frame.
I know other editors are grappling with these concepts. Any help on the above would be greatly appreciated.
This calls for an engineering response! I am afraid the marketing guys can only fake it so much ;-)
I will track down who worked on this UI to give you a better answer.
Third Planet Video
[Steve Bayes] "This calls for an engineering response! I am afraid the marketing guys can only fake it so much ;-) I will track down who worked on this UI to give you a better answer."
Here are the best answers we can wrangle from engineering. Thanks to Emile Tobenfeld (aka Dr. T) and Mike Massey:
> Bi Directional Mix and Nearest Mix as sort of look both ahead and behind vs look at the nearest (either/or) I frame from a current B/P type frame, in predicting motion.
The difference between Nearest Mix and Bidirectional is that Nearest applies the optical flow to the nearest source frame, while Bidirectional applies it to both adjacent source frames and mixes the results. If the optical flow were 100% accurate, these would give the same result. In the real world of using this filter, the advantage of Nearest is that each output frame uses only one input frame, so you won't see an overlaid image at any point. The advantage of Bidirectional is continuity. If you do crazy things (as I did just last night) like slow down abstract moving water footage by a factor of 10, Nearest will give a noticeable discontinuity when you switch from one source frame to another.
I suspect that Bidirectional is likely the better choice at velocities below 50%, and Nearest likely the better choice at higher velocities.
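The trade-off described above can be sketched in a few lines of Python. This is a conceptual illustration only, not the shipping code: `warp_a`/`warp_b` stand for the two warped adjacent source frames (reduced to single pixel values here) and `t` for the output frame's position between them.

```python
# Hypothetical sketch of the two mix strategies (not BCC's implementation).

def nearest_mix(warp_a, warp_b, t):
    # Use only the closer source frame: no ghosting/overlay, but a
    # visible jump each time t crosses 0.5 and the source switches.
    return warp_a if t < 0.5 else warp_b

def bidirectional_mix(warp_a, warp_b, t):
    # Cross-fade both warped frames: continuous over t, but any flow
    # error shows up as a double image, since two sources contribute.
    return (1.0 - t) * warp_a + t * warp_b

# With a flow error (warp_a and warp_b disagree), nearest_mix jumps from
# 10.0 to 20.0 at t = 0.5, while bidirectional_mix ramps smoothly.
```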
The best way to learn this filter is to take a one-second piece of footage, render out multiple variations, and look at them.
> My thinking was
> that a faster moving object might need to look at a wider region in the
> frame to find where it came from/should go to.
Low Velocity Correction kicks in when the motion estimator finds motion perpendicular to an edge. In that case (an edge with little or no actual motion), Low Velocity Correction will constrain the motion vector. An example is when changing lighting appears as motion.
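As a rough illustration of that description, constraining small motion vectors might look like the sketch below. This is a guess at the behavior for teaching purposes, not the real implementation; the function name and the damping formula are invented here.

```python
# Hypothetical sketch of a low-velocity constraint (not BCC's code).

def low_velocity_correct(vec, strength):
    """Damp motion vectors whose magnitude falls below `strength`.

    The idea: tiny estimated motions (e.g. lighting flicker read as
    motion perpendicular to an edge) get pulled toward zero, while
    genuine large motions pass through untouched.
    """
    dy, dx = vec
    mag = (dy * dy + dx * dx) ** 0.5
    if mag < strength:
        scale = mag / strength  # shrinks small vectors smoothly toward 0
        return (dy * scale, dx * scale)
    return vec

# A small vector gets damped; a large one is left alone:
# low_velocity_correct((0.0, 1.0), 2.0) -> (0.0, 0.5)
# low_velocity_correct((0.0, 3.0), 2.0) -> (0.0, 3.0)
```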
> I'm also confused about Min. Edge Contrast which "sets the threshold for the minimum amount of detail in a region." Just not sure how edge detail or contrast on those edges are affected by Optical Flow given one is avoiding the softening of Frame Blending.
The motion vectors are weighted by edge contrast. No motion estimator can detect motion along an edge. BFX optical flow relies on contrast variations over relatively long ranges. Some footage shows artifacts from motion detected in areas without long-range contrast, such as grass. If there is no detail, the estimator's result in that region is unreliable. Min. Edge Contrast sets a threshold for the minimum level of detail that the estimator expects.
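That description suggests something like a contrast gate on each estimated vector. The sketch below is a hypothetical Python illustration of that idea, not the BCC code; the function and its all-or-nothing gating are assumptions.

```python
# Hypothetical sketch of gating flow vectors by local contrast
# (not BCC's implementation).

def gate_by_contrast(vec, local_contrast, min_edge_contrast):
    """Reject a motion vector estimated in a low-detail region.

    Flat areas (e.g. featureless grass) produce unreliable estimates,
    so vectors from regions whose detail measure falls below the
    threshold are zeroed rather than trusted.
    """
    if local_contrast < min_edge_contrast:
        return (0.0, 0.0)  # not enough detail: treat as no motion
    return vec

# A vector from a detailed region passes; one from a flat region is dropped:
# gate_by_contrast((1.0, 2.0), 150, 100) -> (1.0, 2.0)
# gate_by_contrast((1.0, 2.0), 30, 100)  -> (0.0, 0.0)
```

Raising the threshold (e.g. toward the 200-400 the manual suggests) rejects more estimates in busy-but-flat areas; lowering it (toward 50) trusts estimates near real motion boundaries.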
> I thought displaying the Vectors would help in my tweaking but I find the vectors aren't visible in Draft Only. They are when set to Render, which I can then turn off before the render. The problem is when doing Option-p (FCP - Play all Frames) and try to park on a frame at issue, it jumps to another frame.
You may get different results (read: motion vectors) when rendering in Draft. Draft just sends a smaller frame to the estimator.
Media 100/Boris FX
Thanks, Steve. I'll digest this. It certainly gives me more of the info I'm looking for.
I was a big fan/user of Dr. T's KCS (Keyboard Controlled Sequencer), Amiga version. Nothing like it for doing "experimental" avant-garde music stuff. So when is Boris going to come up with a new version? ;-) Well, at least you should market "Dr. T's FX plug-ins." I actually saw Emile perform in NYC.