
COW Forums : Adobe After Effects

Greg Sage
sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 5:22:43 am
Last Edited By Greg Sage on Jun 11, 2019 at 9:38:09 am

I have video of a talking head... just a bust of someone facing the camera and speaking in front of a greenscreen, and I need to paint a flag on their face so it moves with their facial motions. I can get a decent enough wrap on a still frame just by using luminance values as a displacement map and overlay mode for the flag over a desaturated face.
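For reference, the luma-displacement trick boils down to something like this. This is a hypothetical sketch of the idea, not AE's actual implementation; the mid-gray-neutral behavior mimics how AE's Displacement Map effect treats 50% gray as "no shift":

```javascript
// Hypothetical sketch: shift each flag pixel horizontally by an amount
// proportional to the face's brightness at that spot, with 0.5 (mid gray)
// meaning "no displacement" (as in AE's Displacement Map effect).
// flag, faceLuma: flat row-major arrays of grayscale values in [0..1].
function displaceByLuma(flag, faceLuma, width, height, strength) {
  const out = new Array(width * height);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      // offset scales with how far the luma is from neutral gray
      const d = Math.round((faceLuma[y * width + x] - 0.5) * strength);
      // clamp the sample position to the image bounds
      const sx = Math.min(width - 1, Math.max(0, x + d));
      out[y * width + x] = flag[y * width + sx];
    }
  }
  return out;
}
```

This is exactly why the result looks like a projection: the shift depends only on per-pixel brightness at each frame, with no notion of the face's actual geometry moving underneath.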

Unfortunately, while the displacement shifts as the luma values in the face do, it's like a projection of a flag onto a face vs. the flag really sticking to the facial geometry.

I thought the most convincing results might be found by tracking a variety of facial planes via Mocha Pro, but I'm just not seeing how the various parts of the flag can be comped together across those planes so the whole thing flows as one image. There's a basic deformation grid, but I'm not seeing any smart way to animate the changes.

I also want the eyes and mouth to not be painted when open, so I'm assuming that requires rotoscoping.

Just frustrated at this point, so I want to take a step back and ask if maybe there's some other tool or technique I could be using... or an effective way to combine all those individual face planes so they produce one smoothly flowing image.

Seems like what I really need is something that tracks a number of spots on the face and uses that PSR data to drive points in a mesh warp. Is there something that works like that?
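To be concrete about what I mean: each mesh vertex would move by a blend of the displacements of the nearby tracked spots. A hypothetical sketch (not a feature of any particular tool) using inverse-distance weighting:

```javascript
// Hypothetical sketch: move warp-mesh vertices by the displacement of
// tracked face points, weighted by inverse distance to each point.
// meshRest: [{x, y}] mesh vertex positions at the rest frame.
// trackedRest / trackedNow: [{x, y}] tracked spot positions at the
// rest frame and at the current frame.
function warpMesh(meshRest, trackedRest, trackedNow, power = 2) {
  return meshRest.map(v => {
    let wSum = 0, dx = 0, dy = 0;
    for (let i = 0; i < trackedRest.length; i++) {
      const t = trackedRest[i];
      const dist = Math.hypot(v.x - t.x, v.y - t.y);
      if (dist < 1e-6) {
        // vertex sits exactly on a track point: follow it exactly
        return { x: trackedNow[i].x, y: trackedNow[i].y };
      }
      const w = 1 / Math.pow(dist, power);
      wSum += w;
      dx += w * (trackedNow[i].x - t.x);
      dy += w * (trackedNow[i].y - t.y);
    }
    // weighted average of the tracked displacements
    return { x: v.x + dx / wSum, y: v.y + dy / wSum };
  });
}
```

Run per frame, the tracked positions would produce a new mesh that drags the painted texture along with the face.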




I see Snapchat and other phone apps essentially doing real-time facial tracking, where the facial features move with you... or drive things like Adobe Character Animator.

Is there some way to basically use existing tools to DIY a custom version of something like that, one that outputs a warped version of the input image that can then be composited onto the face using the various modes, masks, and other tools in AE?



Kalleheikki Kannisto
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 10:06:49 am

I haven't seen such a tool for AE. You may want to look into Nuke.







Kalleheikki Kannisto
Senior Graphic Designer



Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 5:03:59 pm

Yeah. That would do it. Seems like an overkill workflow in my case, though.

I've got a side and a front shot of the same talking head, but it's all low-res 720p footage, and I'll likely never need to do this again. The shots don't even move much, so any fully automated tool would likely work fine.

Any simplified versions or other approaches that would provide "good enough" results in my simplified case?




Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 10:16:15 am

Basically, is there anything that does this:

https://www.banuba.com/facearsdk

as an AE plugin, or as a standalone, that can allow me to paint on a face so it displaces to match the facial geometry and tracks with the various facial deformations?



Dirk de Jong
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 12:21:37 pm

[Greg Sage] "I need to paint a flag on their face so it moves with their facial motions"

I don't know of a good way to do it in AE. The way the tools from Facebook (for authoring Facebook and Instagram filters) or Snapchat (for authoring Snapchat filters) do it is with a face mesh (a default built-in mesh, or you can load your own) that follows the face and deforms along with it based on face tracking. You can map your own texture to that mesh and use transparency, blend modes, etc. to make it look like the texture is "painted on" to the original face.

I've done some of this and recently posted a screencap to show someone what it looks like and what some of the capabilities are: [ http://kingluma.com/augmented.mp4 ] Starting at about the 40-second mark, this video shows the Facebook standalone (called Spark AR Studio). The Snapchat app is called "Lens Studio".

AFAIK these apps don't have export functionality, so they're not ideal for use as a post-processing tool on a video file... but their tracking algorithms seem pretty good : )



Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 5:00:07 pm

I read this just after stumbling across Spark AR.

I got super excited about Spark until I realized they had probably crippled the export capabilities... or at least limited the output resolutions, etc., as those wouldn't be necessary features for its intended use.

I was just about to download it. Maybe I shouldn't waste the time, though. Just how limited (or non-existent) is the export? Can any of the data or footage be exported in any way? I see a few details in the Getting Started docs, but I'm not sure if they refer to the final output of the filters as employed on FB, or to the downloaded development tool.




Dave LaRonde
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 4:11:29 pm
Last Edited By Dave LaRonde on Jun 11, 2019 at 4:14:43 pm

[Greg Sage] "Just frustrated at this point, so want to take a step back and ask if maybe there's some other tool or technique I can be using... or an effective way to combine all those individual face planes so they produce one smoothly flowing image"

1) In recent years, Adobe has been focused on making After Effects into a mere stop on the marketing sausage factory, wherein one takes data from outside sources and turns it into video on web sites in many permutations, creativity be damned.

2) Snapchat has the luxury of making a narrowly-defined tool for a specific purpose. ... and it's new, which is no small thing. After Effects at its heart remains a 25-year-old Swiss Army Knife for video. Ever make a piece of fine cabinetry using just a Swiss Army Knife?
You can, you know....

Dave LaRonde
Promotion Producer
KGAN (CBS) & KFXA (Fox) Cedar Rapids, IA



Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 6:12:55 pm

Understood. Unfortunately, I need to get this out in 48 hours now, and it's turning out to be a much bigger issue than anticipated.

Looks like Nuke might do it, but I don't know that I have enough time to iron out the new workflow issues, and looking at Spark and Lens Studio, I'm not sure I can do anything with what they produce.

The simplest (good enough) solution, since I'm already familiar with AE, would seem to be using facial tracking data to move puppet pushpins or similar: take a still frame of the rest pose, paint it, then use the moving tracked pins to warp the painted version. This would also allow me to make a much larger resolution image by doing the whole thing at 4x the footage's actual resolution of 1280x720, since I'm replacing the actual face texture anyway and only using its luma values, which I was going to run a basic beauty blur on anyway.
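The coordinate math for that is simple enough. A minimal sketch, assuming the painted still is a uniform 4x upscale of the 1280x720 footage (the function names here are hypothetical, not AE API):

```javascript
// Hypothetical sketch: map per-frame track data from 1280x720 footage
// space into the space of a painted still rendered at 4x resolution
// (5120x2880), which is where the puppet pins would live.
const SCALE = 4; // painted still is 4x the footage resolution (assumption)

// Absolute pin position on the painted still for a given track point
function trackToPinPosition(trackPoint) {
  return { x: trackPoint.x * SCALE, y: trackPoint.y * SCALE };
}

// Per-frame offset of a pin from its rest pose, in painted-still pixels
function pinOffset(trackNow, trackRest) {
  return {
    x: (trackNow.x - trackRest.x) * SCALE,
    y: (trackNow.y - trackRest.y) * SCALE,
  };
}
```

So one pixel of tracked motion in the footage becomes four pixels of pin motion on the painted frame, and the warp scales cleanly with the upscale.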

To that end, I feel like I'm missing something in all the AE face tracking tutorials. Even on my straight-ahead face that never turns and is well lit in front of a greenscreen, its tracking data is garbage. None of the tutorials seem to have this issue, and I see very few controls. Is there some way to manually set all the pins first, and then have it track them?

Is there some other 2D facial tracker (one that works without creating a 3D face mesh) that does a better job and can export its tracking data into AE? Or some other hybrid workflow involving Lens, Spark, CrazyTalk, or similar?



Dirk de Jong
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 6:51:06 pm

[Greg Sage] "Some other hybrid workflow including Lens, Spark, Crazytalk, or similar?"

Here's a kluge (which, believe it or not, I've actually used before)...

It's possible in Spark AR to show only the flag texture (as mapped to the face mesh) over a flat solid color (maybe black, white, or bright green) without actually showing the source video at all (though the source video is still being used for the tracking data). Then screen-cap that, take the screen cap video into AE (position and scale as needed), and composite it with the original talking head shot using a blend mode and/or key. It can actually look OK depending on the context.

In Spark AR, the way I made the solid color was to create a 3D plane (with a flat color material on it), facing the camera and pushed back in 3D space so it sat behind the face mesh. This obscures/hides the source video, since I seem to remember there is no built-in way to simply disable the visibility of the source video.

What's the duration of the shot you need to do?




Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 7:40:55 pm

I have a weird situation (part of why I was trying to do this in AE to begin with).

I have a 4-minute clip of a cubist face I've composited from side and front shots of the same talking head in front of a green screen. Similar to this:



But it's video, and he's talking. Both shots are already motion stabilized, and I've already comped it together in AE. The cubist comping of the two shots requires warping each shot to get them to fit together correctly.

The resulting cubist comp may or may not be recognized as a face by facial tracking software, but it would be preferable to track it as such, since the textures and paint I overlay would then help hide the warps where, for instance, the nose doesn't quite move correctly with the cheek as they're comped together.

If necessary, I could track the front and side shots separately, paint them separately, then do the comping again from the painted versions of each, but then the warps would be more noticeable, as they'd be baked into the comping.


Bottom line: I can either track and paint the comped cubist version (preferred path), or, if necessary, track the front and side shots separately (using either the original footage or my pre-stabilized versions), paint them separately, then comp them together later. It really depends on whether the software will allow me to track the comped version.




So... would I then take the rest frame into PS, paint it, and use that as the texture?



Dirk de Jong
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 8:03:48 pm

[Greg Sage] "Resulting cubist comp may or may not be recognized as a face by facial tracking softwares,"

Yeah, that'd probably be a huge problem for the social media filter authoring software's face tracking (the cubist face).



Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 8:18:23 pm

So in the screen capture method you're talking about... am I understanding correctly that the screen capture is then the painted version of the face over a black background?

Of course, that assumes I'm using a face it can track, and that I can find some way to do the side bit. It's really just the nose and the alpha edge being used from the side, though, so I could paint the nose separately and use the alpha edge for the comping.

Just being able to paint the front face would get me halfway there.




Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 8:09:29 pm

EDIT: Just thought of something. Maybe I'm looking at this all wrong, and there may be a third approach. The comping was a major pain because of all the skewing, where even a slight turn in one direction translates differently in the front vs. side shot. Since the head is being painted anyway, it might just be better to create a single talking 3D head driven by the front shot's tracking data (and possibly texture information from the side shot, if it's helpful for the head wrap)... then just render out the head mesh with the texture from both front and side angles and do the comping from that. In theory, it should be much more stable.

Speaking of stable: trying to think this through... where exactly in this workflow (or the screen-grab one you mention) would I lock down a particular tracking point if I want, for instance, to keep the left eye socket perfectly motionless, so all motion translates from the center of the eye socket (not the pupil, since it may move)?
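One generic way to do that lock-down, regardless of the tool: subtract the anchor point's per-frame displacement from every tracked point, so the chosen point (the left eye socket here) stays fixed and all other motion becomes relative to it. A hypothetical sketch:

```javascript
// Hypothetical sketch: re-express all tracked points relative to one
// anchor point so the anchor (e.g. the left eye socket) stays perfectly
// still across frames.
// pointsNow: tracked positions this frame; anchorNow: the anchor's
// position this frame; anchorRest: the anchor's rest-frame position.
function stabilizeToAnchor(pointsNow, anchorNow, anchorRest) {
  // how far the anchor has drifted from its rest position
  const dx = anchorNow.x - anchorRest.x;
  const dy = anchorNow.y - anchorRest.y;
  // cancel that drift out of every point, anchor included
  return pointsNow.map(p => ({ x: p.x - dx, y: p.y - dy }));
}
```

This is the same idea as stabilizing a shot to a track point, just applied to the track data itself before it drives pins or a mesh.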



Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 8:13:27 pm

I find the layout of this forum quite confusing. The double post here was entirely unnecessary. Why is there no edit button?



Ross Shain
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 7:38:53 pm

Mocha Pro has a Mesh Warper, which will allow you to animate a mesh for organic tracks. This is applied to one tracked surface. https://library.creativecow.net/g_tobias/scars-tattoos-digital-makeup-adobe...

For more advanced techniques, you can parent multiple Mocha-tracked surfaces to the Puppet Tool. This video might be useful: https://borisfx.com/videos/mocha-and-after-effects-puppet-tool/

Or, both Nuke and Autodesk Flame have tools for organic surface mapping. Nuke's is called Smart Vectors.


Ross Shain
Boris FX / Imagineer Systems
https://borisfx.com/products/mocha/




Greg Sage
Re: sticking textures and images to moving face... Configurable DIY Snapchat filters
on Jun 11, 2019 at 7:59:51 pm

Unless I'm missing something, the Mesh Warper would seem to need manual keyframes each time the facial geometry shifts. It seems better suited to mapping flat things onto surfaces that are warped but stable, like a cylinder.

I had somehow missed the second vid. That sure looks a lot closer, and is more or less what I was trying to do yesterday, but I got confused trying to do the multiple corner-pin morph across planes.

Looking into Nuke now.



© 2019 CreativeCOW.net All Rights Reserved