What can C4D be paired with for live, motion-tracked visual FX?
This may be a little off-topic for the C4D forum, but here goes: I've been thinking of ways to create 3D visual accompaniment to live performers. For example, an animation that is tracked to a dancer's body, or particles that appear to emit from a musician's instrument. Someone said that Kinect Standalone V2 could be paired with C4D, and utilize the built-in tracker within the software. Is there anyone here who is knowledgeable about this sort of thing? Is there a different combination of hardware and software that has worked well for you in terms of live motion-tracked imagery?
Here are a couple videos that sparked my interest. One is mainly 3D (and so far out of my league that I'm making it a long term goal). The other is primarily 2D, but in some ways, is closer to my aesthetic taste than the first. Both are very advanced, and I'm just gathering as much information as I can, so I appreciate any and all advice that will help me get started!
Also, for motion-tracked 2D effects, would you recommend Processing, Quartz Composer, or Touchdesigner?
Thank you so much!
A couple of things to keep in mind. C4D is not a real-time renderer. And while you may have a killer card with a bazillion cores that will play a sequence of things in real time, it probably won't start playing in real time. Add to that delay the lag of the motion-tracking processing system and the camera sensor and its offloader, then add the lag of the projector and video display system. Any one of these is significant; put them together and it can really add up.
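To put rough numbers on how the delays stack, here's a hypothetical latency budget. Every figure is an illustrative guess, not a measurement -- your actual numbers depend entirely on your camera, tracker, GPU, and projector:

```python
# Hypothetical end-to-end latency budget for a live tracked-visuals rig.
# Every number here is an illustrative guess, not a measurement.
stages_ms = {
    "camera sensor + capture": 33,   # roughly one frame at 30 fps
    "tracking / processing":   50,
    "render + GPU swap":       16,   # roughly one frame at 60 fps
    "projector / display":     30,
}
total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:26s} {ms:4d} ms")
print(f"{'total':26s} {total:4d} ms")
```

Even with each stage seeming tolerable on its own, the chain lands well over 100 ms end to end, which is why any one slow link matters.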
Even using a Kinect there is lag when the hardware or software isn't dialed right in or dedicated to those tasks.
As much as Processing was a terrific tool and led to all sorts of great things like Arduino, I wouldn't call it speedy. I don't know if that's the app itself, the compiler, or the handoff that happens in between.
I'm not sure if Unity would be a good choice or not -- it's fast enough for gaming, but there is a bunch of overhead there that you don't need. It is real time for sure, though, and the renders get better every day.
If you are thinking about generating canned 3D movies in C4D that play over a live actor, that can work, but remember live people don't have a SMPTE timecode track, so they don't always stick to the plan -- it's why they have live orchestras for live performances, because anything can and will happen to delay a cue. A live generated system triggered by both actor and tech is the best bet. But often you need something that just can't be rendered in real time, so you can supplement with canned things that the tracker places.
We've done tracking and real-time reactive particle systems in both Java and C++, and compiled, they run at a decent clip (paired with a Kinect, or with one or more cameras and some machine-vision tracking). We've even done some in Flash (although it started to choke when the particles got too heavy or the stage got too big).
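The core of a tracker-reactive particle system like that is simpler than it sounds. Here's a minimal sketch in Python of the idea -- spawn particles at the tracked point each frame, age them, and cull the dead ones. The `Particle` class, lifetimes, and the stubbed tracker output are all made up for illustration; in a real rig the emitter position would come from the Kinect or your machine-vision tracker:

```python
import random

# Minimal sketch of a tracker-driven particle emitter. The "tracked
# position" is stubbed; a real rig would feed in Kinect or camera data.

class Particle:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = random.uniform(-1.0, 1.0)
        self.vy = random.uniform(-2.0, -0.5)  # drift upward
        self.life = 60                        # frames to live

    def step(self):
        self.x += self.vx
        self.y += self.vy
        self.life -= 1

def update(particles, emitter_xy, per_frame=5):
    """Spawn new particles at the tracked point, age the rest, cull dead ones."""
    ex, ey = emitter_xy
    particles.extend(Particle(ex, ey) for _ in range(per_frame))
    for p in particles:
        p.step()
    particles[:] = [p for p in particles if p.life > 0]
    return particles

# Fake ten frames of a guitarist drifting left to right
particles = []
for frame in range(10):
    tracked = (100 + frame * 4, 240)  # stubbed tracker output
    update(particles, tracked)
print(len(particles))  # 10 frames x 5 spawned, none expired yet -> 50
```

The same loop structure carries over to Java or C++; the heavy lifting in a real system is the tracking and the rendering, not this bookkeeping.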
For those with way too much money there's CoolLux (it may be under a different name now, as they keep getting bought out) and Green Hippo. Both are happy to work with you to customize to your needs.
But you can hack something together in just about anything that can take a camera or kinect input and output to a video card. Just check how fast that input and output happen because there is a lot going on there.
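Checking how fast that input and output actually happen is easy to do empirically. Here's a rough timing harness -- `grab_frame()` and `process()` are stand-ins (the sleeps just simulate cost); in practice you'd swap in something like OpenCV's `VideoCapture.read()` and your real tracking code:

```python
import time

# Rough harness for measuring how fast a capture -> process -> output
# loop actually runs. grab_frame() and process() are stand-ins; swap in
# your real camera read and tracking step.

def grab_frame():
    time.sleep(0.005)   # pretend the camera hand-off costs ~5 ms
    return object()

def process(frame):
    time.sleep(0.002)   # pretend tracking costs ~2 ms
    return frame

def measure_fps(n_frames=30):
    start = time.perf_counter()
    for _ in range(n_frames):
        process(grab_frame())
    elapsed = time.perf_counter() - start
    return n_frames / elapsed

fps = measure_fps()
print(f"loop runs at roughly {fps:.0f} fps")
```

If the number that comes out is below your target frame rate before you've drawn a single pixel, you know where the bottleneck is.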
Again, I assume you are talking real time -- if not, just say so and we can recommend some tracking solutions that don't work in real time.
Thank you for your answer, Steve! Yes, I do mean real time -- a situation where there is no script, per se, but perhaps some predictable/repetitive types of motion. Ultimately, my goal is to do music-based visuals for bands: things like music notes/particles exploding out of guitars, in a way that follows the guitarist as he moves around. The particle programming you mentioned sounds extremely interesting, and I'd aim for a toned-down, simpler version of that in my own work.

Actually, much of what I need could be covered by 2D -- so maybe it's not important that I use Unity or C4D for now, although I might try some canned 3D effects too. To keep things simple, I'd like to follow the method with the least amount of coding. I'm still new to it, but I do aim to delve deeper in future projects. It's not important that the tracking be accurate to the second, and I'm also open to solutions that are a compromise between canned material and motion tracking. I would love to hear more about the live generated system you suggest. Any detail you can provide is immensely helpful to me in researching the possibilities. Thank you again for your informative reply!!
The Unity engine is even faster with just 2D. It's a nice package because you can use art or timelines or programming, or all three, to achieve your task. And there is a free version! It plays well with others -- even Flash! And Flash doesn't like playing with anything!
You shouldn't have to worry about lag at all. We were fretting over 14 milliseconds of projector lag, so a second or so is an eternity.
You could also use C4D (if that's what you are used to) to make elements that get used on sprites in your real-time particle package. Particular does this in AE -- for instance, it uses volumetric renders of clouds done in C4D (little snippet movies) and puts them on thousands of sprites to simulate volumetric smoke or explosions. It's very convincing but doesn't take the volumetric render hit.
There's so much already built for the Kinect (from an input/output/coding point of view) that even a beginner can get the data they want from it and put it to use in a whole plethora of packages -- and for what it's doing, it's very, very fast. You just have to explain to your significant other why you are making odd gestures in front of your "web cam" in the middle of the night.
If you are going to use guitars as a mapping surface, you could ask the band to put little tracking dots on the guitars. These could be UV or IR sensitive, so you might not even see them (but the camera would); at the very least they wouldn't show up as traditional tracking dots. That would add another layer of object confirmation for the camera system and would help with orientation when the camera sees the guitar at oblique angles (research "fiducial markers"). You can also build a library of shapes -- all the ways a guitar can be oriented. The camera whips through that library every frame and returns a "hey, I know what that is and which way it's facing" set of data.
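The "library of shapes" lookup boils down to nearest-neighbor matching. Here's a toy version of the idea -- the orientation names and the 2D descriptors are entirely made up for illustration; a real system would compute descriptors from image features, or skip this and use fiducial marker detection such as OpenCV's ArUco module:

```python
import math

# Toy "library of shapes": precompute a descriptor per known orientation,
# then match each frame's observed descriptor to the closest entry.
# Descriptors here are made-up 2D numbers; a real system would derive
# them from image features or use fiducial (ArUco-style) markers.

shape_library = {
    "facing camera": (1.00, 0.10),
    "oblique left":  (0.70, 0.55),
    "oblique right": (0.70, -0.55),
    "edge on":       (0.15, 0.00),
}

def classify(observed):
    """Return the library orientation whose descriptor is nearest."""
    return min(shape_library,
               key=lambda name: math.dist(observed, shape_library[name]))

print(classify((0.72, 0.50)))   # -> "oblique left"
```

With only a handful of orientations in the library, this lookup is trivially fast enough to run every frame, which is exactly what lets the camera system "whip through" it.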
There are a lot of libraries out there for face recognition and body recognition. And there was some new stuff released just the other day regarding real-time face replacement for movies (think "Tarkin" in Star Wars). While you don't need that level of sophistication, when you have that kind of algorithm, you can do what you are after faster than the blink of an eye. I'll find the link and post it.
It's a good field to get into because AR and VR (in real time) are going to be all the rage when the computing power and algorithms get just a little bit better.
This is insanely helpful -- thank you, Steve! I've downloaded the free version of Unity and am following tutorials to start learning it. Can't wait to try the 3D sprite particles, and also give IR-sensitive tracking points on the instruments a try -- it sounds like a smart idea. I'm sure the musicians I work with in future would be willing to put them on for the sake of accuracy. Now, if I may ask a complete beginner question: What version of Kinect do you use, and do you happen to know which to buy for Mac? I have a MacBook Pro 15" (OS X Sierra) with Thunderbolt 3 output, and it's been hard to find information on compatibility. Some people say to get Version 2 and buy the power supply separately, and others say that Kinect for Mac is actually just software that mimics Kinect (not sure how this is possible). There's so much conflicting info that I thought I'd ask users directly. I know I don't want to use it in conjunction with Xbox, so I need the standalone version. But other than that, I'm a little unsure. Thank you so much for the time and patience you put into your answers! I keep referring back as needed, and it's helping me a lot in narrowing down strategies and tools.
One seems to be more choreographed than tracked with real-time graphics.