Sync multichannel audio during a 2-camera shoot
Hello out there in audioland. I'm a composer, performer, and audio engineer. I have some video experience, but I've run into an issue I've never encountered before, so I'm hoping for some advice.
I have a modern music ensemble that's going to be shooting green screen video for a multimedia music piece. We are shooting with 2 cameras and recording audio (24-bit/192 kHz) with 4 to 6 microphones, with the intention of mixing down to stereo in the final mix.
If we were just recording stereo, no problem. I have plenty of experience doing that. Adding the multichannel audio seems to mean that we can't record straight to a camera.
My backup plan is to shoot the cameras to tape and record the audio on a DAW, then manually sync in post. This seems like a huge pain.
I've been investigating direct-to-disk applications, and I'm wondering if this is the answer. Can someone suggest a good tool to capture the video from 2 cameras and sync the 4 to 6 channels of audio coming in from my ASIO FireWire sound card?
I have access to Vegas 9 and the Adobe production suite. I'm good with Vegas, brand new to Adobe.
Ideally, whatever tool I shoot with will be able to export the synchronized audio and video to Vegas for the post work, which involves creating a bunch of audio/video loops. The musicians will be playing to a click track.
Hope this makes sense. Thanks in advance! My first post here!
Normally you use a master TC generator, and slave all your equipment to that.
But it highly depends on the equipment used.
Now you can always use a spare channel to record the TC (it's an audio signal) and decode it afterwards. But if you have cams with a TC input, that is not needed. (Could be needed for your sound recorder though.)
If your sound recorder cannot slave to TC, I have software that can convert Wave files to BWF, based on the audio TC.
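For reference, the heart of that Wave-to-BWF conversion is writing a bext chunk whose TimeReference field counts samples elapsed since midnight, derived from the decoded timecode. A minimal sketch of just that arithmetic (plain Python; non-drop-frame timecode assumed, and the function name is mine, not from any particular tool):

```python
def tc_to_time_reference(tc: str, fps: float, sample_rate: int) -> int:
    """Convert a non-drop-frame timecode string (HH:MM:SS:FF) to a
    BWF bext TimeReference value: samples elapsed since midnight."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    seconds = hh * 3600 + mm * 60 + ss + ff / fps
    return round(seconds * sample_rate)

# One hour of 25 fps timecode at 48 kHz:
print(tc_to_time_reference("01:00:00:00", 25, 48000))  # 172800000
```

Any BWF-aware app (including recent Vegas versions) can then place the file on the timeline from that single number, which is why stamping the files once beats nudging clips by hand.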
Same can be done for video, but that will be Quicktime based then.
No idea if you intend to use Quicktime, nor if Vegas supports Quicktime TC.
So please elaborate on the equipment used!
smart tools for video pros
You don't mention what you intend to use to record the audio. Whatever it is, you must be able to sync it as Bouke suggests. There's the Sound Devices 788T, which has timecode but only records up to 96 kHz. However, I'm not sure there's even an easy way to play 192 kHz audio with video. You'd probably have to play your video off one device, which would be chased by Pro Tools (or another DAW) playing out the audio.
MacPro 8-core 2.8GHz 8 GB RAM OS 10.5.8 QT7.6.4 Kona 3 Dual Cinema 23 ATI Radeon HD 3870, 24" TV-Logic Monitor, ATTO ExpressSAS R380 RAID Adapter, PDE enclosure with 8-drive 6TB RAID 5
FCS 3 (FCP 7.0.3, Motion 4.0.3, Comp 3.5.3, DVDSP 4.2.2, Color 1.5.3)
Pro Tools HD w SYNC IO & 192 Digital I/O, Yamaha DM1000, Millennia Media HV-3C, Neumann U87, Schoeps Mk41 mics, Genelec Monitors, PrimaLT ISDN
I have done several live concert videos this way, WITHOUT using cameras or recorders capable of timecode.
My workflow is: record the ensemble multi-mic, multi-track (I have been using an Alesis HD24 hard-disk recorder), then mix down the audio the way you want it, as if it were an audio-only production.
The cameras will pick up a "scratch track" audio with the on-camera microphones (likely the only time the on-board mic is handy). Or, if you are particularly ambitious, you could create a live stereo "scratch mix" and feed that to the cameras, but IMHO that is more trouble than it's worth.
Then take the mixed-down stereo track and put it into your non-linear editor (NLE). THE MIXED-DOWN AUDIO TRACK IS NOW THE MASTER FOR THE WHOLE PRODUCTION. Now you can take the video from the cameras and lay them into the video tracks in the NLE. You can use the sound from the camera video to slide the video forward and back to get it perfectly in sync with the master audio mixed-down track. Don't worry about the video going out of sync. Since you are using multiple cameras, any place you switch from one camera to another, you can use that edit point to "pull up" the video to sync with the master sound track. Of course, after you get everything synced, you mute/remove the audio from the cameras, leaving only the master mix-down.
This sounds more difficult than it is to actually do it. Of course, this would not be efficient if you were doing this all the time, or if you were on a very tight editing schedule, etc. Professionals doing high-end productions (feature film, prime-time TV, etc.) use timecode to automatically handle the synchronizing task. But with modern NLEs it is ever so much easier to do manually, on the cheap.
There is even software available as an NLE plug-in that will analyze the audio tracks from the cameras and slide them into place to sync with the master audio track.
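The idea behind those tools is simple: cross-correlate the camera's scratch audio against the master mix and shift it by the lag with the best match. A toy sketch of the principle (pure Python, brute force on short lists; real tools work on envelopes of long files, but the math is the same):

```python
def find_offset(master, scratch):
    """Slide `scratch` along `master` and return the lag (in samples)
    that maximizes the dot product -- a brute-force cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(master) - len(scratch) + 1):
        score = sum(m * s for m, s in zip(master[lag:], scratch))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Toy demo: a short pattern buried 5 samples into a quiet track.
pattern = [0.9, -0.4, 0.7, -0.8, 0.5]
track = [0.0] * 5 + pattern + [0.0] * 5
print(find_offset(track, pattern))  # 5
```

Once you know the lag, syncing is just sliding the camera clip by that many samples, which is exactly the manual "nudge until the waveforms line up" step, automated.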
If you can find another camera or two, it would be very advantageous. It is nice to have a locked-down wide shot as a "safety" to cut to if you don't have anything else appropriate. It is also nice to have a hand-held camera to get reverse-angle audience shots, or shots from behind the musicians (up-stage, if accessible), etc. And camera operators who know the music are also a plus, so they can make shots and moves appropriate for the mood, style and tempo of the music.