Well, I started playing with Adobe Character Animator, but I was looking for a way to use some of its functionality inside of After Effects.
Character Animator can generate visemes (trigger mouth shapes) from a microphone in real time, or compute them from scene audio. Even though you could use the image sequence of visemes rendered from CH in AE, having actual keyframes in AE gives you far more possibilities.
As there is no option to copy any data or keyframes from CH to AE, I made a simple setup that ultimately provides you with keyframes matching your audio, ready for the standard workflow of driving the time remap of a mouth-shapes comp.
The idea is simple:
In addition to the animated visemes, flashing dots are rendered at specified positions; these are then read/sampled in AE by an expression, which can in turn be converted to keyframes.
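The sampling logic can be modeled outside AE as a plain function: given the pixel values sampled at each dot position, return the time-remap value selecting the matching mouth frame. This is a sketch under my own assumptions (a layer hypothetically named "Dots", white dots on black, one dot per viseme), not the exact setup described above; inside AE, the samples would come from the built-in `sampleImage()` expression method.

```javascript
// Given one sampled pixel per dot (arrays of [r, g, b, a] in 0..1)
// and the comp's frame duration, return the time-remap value that
// jumps to the mouth frame whose dot is currently lit (flashing white).
function remapFromDots(samples, frameDuration) {
  for (var i = 0; i < samples.length; i++) {
    if (samples[i][0] > 0.5) return i * frameDuration; // dot i is lit
  }
  return 0; // no dot lit: fall back to the first (neutral) mouth frame
}

// In an AE expression on the mouth comp's Time Remap property, the
// samples would be gathered roughly like this (layer name and dot
// positions are placeholders for your own setup):
//   thisComp.layer("Dots").sampleImage(dotPositions[i], [2, 2], true, time)

// Second dot lit -> remap to the second mouth frame
console.log(remapFromDots([[0, 0, 0, 1], [1, 1, 1, 1], [0, 0, 0, 1]], 1 / 24));
```

Once this expression produces the right values per frame, AE's "Convert Expression to Keyframes" keyframe assistant bakes it into plain keyframes, so the dots layer can be removed afterwards.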
If you have access to Adobe Character Animator (i.e. you are subscribed to Creative Cloud), give it a try next time you need some lip syncing.