I've seen and heard of a real-time process for lip syncing or animating a 2D animated character using video input, where the person on camera drives the character's animation. Ex. the real-life model raises his eyebrow, and in real time the character's eyebrow moves on the computer monitor as well. It was not a motion capture suit or procedure. Any clues?
It is machine vision and real-time motion capture. I have seen software that uses a camera shot close up on your lips or face to recognize changes and drive a model face in real time, but I can't remember a name off the top of my head. Try looking for a link to this stuff on TurboSquid. They probably advertise there.
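The core idea behind that kind of tracker-driven animation is simple: measure a facial feature in the video frame, normalize it, and feed it to a rig parameter. A minimal sketch, assuming you already get 2D landmark coordinates from some face tracker (the function name, landmark inputs, and rest-pose ratio below are all made-up assumptions, not any particular product's API):

```python
# Hypothetical sketch: convert tracked 2D face landmarks into a 0..1 rig value.
# brow_y / eye_y are vertical pixel positions from an assumed face tracker
# (smaller y = higher on screen); face_height normalizes for distance to camera.

def eyebrow_raise(brow_y, eye_y, face_height, rest_ratio=0.12):
    """Return a 0..1 'raise' weight from the brow-to-eye gap,
    scaled by face height so the value is distance-invariant."""
    ratio = (eye_y - brow_y) / face_height  # larger gap = raised brow
    # Map rest_ratio -> 0.0 and rest_ratio * 1.5 -> 1.0, then clamp.
    raised = (ratio - rest_ratio) / (rest_ratio * 0.5)
    return max(0.0, min(1.0, raised))
```

Run that per frame and write the result straight to the character's eyebrow control, and you get the "he raises his eyebrow, the character does too" effect with no suit involved.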
There are also systems that use puppeteering techniques. These are all pretty spendy.
On the lower end, what some folks do is use a MIDI keyboard or keypad with assignable functions, like those sold for WoW or CAC video games, or even custom apps for Wii controllers, and assign a key or motion to each "muscle" of the model, to "play" it in real time.
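The mapping layer for that puppeteering approach is just a lookup from controller events to rig weights. A minimal sketch, assuming MIDI continuous-controller (CC) messages as input; the control numbers and muscle names here are placeholders for whatever your rig exposes:

```python
# Hypothetical sketch: map MIDI CC events to facial "muscle" weights.
# The control numbers and muscle names are assumptions, not a real rig's API.

MUSCLE_MAP = {
    1: "brow_left",
    2: "brow_right",
    3: "jaw_open",
    4: "smile",
}

def midi_to_muscle(control, value):
    """Translate one MIDI CC message (control number, value 0-127)
    into (muscle_name, weight 0.0-1.0), or None if the knob is unmapped."""
    muscle = MUSCLE_MAP.get(control)
    if muscle is None:
        return None
    return (muscle, value / 127.0)
```

In practice you'd poll your MIDI input in a loop (a library like mido can read the messages), call something like this per event, and push the weight to the model, so each fader or key literally "plays" one muscle.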