
Merging distributed QuickTime segments

COW Forums : Compression Techniques

Christina Crawley
Merging distributed QuickTime segments
on Jul 7, 2008 at 5:43:59 pm

Hi all,

I'm new to the whole distributed processing / Qmaster cluster setup in Compressor 3. The clip I'm working with is DVCPRO HD 720p60, 58.18 GB. I set up a conversion to ProRes HQ 1080i60 (1920 x 1080). Currently the cluster is set up to use only 7 CPUs (instances) from my Xserve (services only) over Ethernet. My Mac Pro is the controller, but its CPUs aren't providing services.

The clip is about 2 hours and 19 minutes, so I know I'm asking a lot, but merging the distributed QuickTime segments is taking forever (on top of the 2 hours to transcode).

The status bar isn't active, so I'm just in "wait and see" mode. I hate "wait and see" mode.

The only info I can find on this topic is here:
http://discussions.apple.com/thread.jspa?threadID=1107862&tstart=75

The guy just rants and no one offered a solution. Anyone have a solution?

Overall, this cluster may still be a time saver (the initial transcoding time was in excess of 5 hours), but I just want to be as efficient as possible.
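
In case it helps anyone reproduce this outside the GUI, Compressor also has a command-line front end that can submit to a named Qmaster cluster. A rough sketch of how a job like this could be submitted, in Python (the flag names are from my reading of Apple's Compressor command-line docs and may differ by version, and the cluster name, setting, and paths are placeholders):

import subprocess

# Path to the Compressor command-line binary inside the app bundle.
COMPRESSOR = "/Applications/Compressor.app/Contents/MacOS/Compressor"

# Submit one job to the managed Qmaster cluster instead of "This Computer".
# Flag names are assumptions based on Apple's documentation -- verify them
# against the docs for your own Compressor version before relying on this.
subprocess.run([
    COMPRESSOR,
    "-clustername", "My Qmaster Cluster",
    "-batchname", "DVCPRO HD to ProRes HQ",
    "-jobpath", "/Volumes/Media/source_720p60.mov",
    "-settingpath", "/Users/me/Library/Application Support/Compressor/ProResHQ-1080i60.setting",
    "-destinationpath", "/Volumes/Media/output/",
], check=True)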

Thanks!

- Christina



David Burch
Re: Merging distributed QuickTime segments
on Sep 1, 2009 at 6:18:43 am

I'm having the same issue, except I'm using a virtual QuickCluster to take advantage of my quad-core system. I'm trying to convert XDCAM EX 720p60 footage to ProRes 422 (also long clips, in excess of an hour and a half each). Because of this merge step, I'm finding that distributed processing takes longer than simply using the local machine with default settings. Does anybody have any workarounds? I would very much like to take advantage of multiple cores, since I generally deal with large numbers of long HD clips, and encoding is by far my biggest bottleneck at this point. Thanks!

Dave



David Burch
Re: Merging distributed QuickTime segments
on Sep 1, 2009 at 6:27:49 am

I should probably add that I am using a 2.66 GHz quad-core Xeon Mac Pro, running Leopard 10.5.8 and the latest version of FCS 3.0. I have 12 GB of 667 MHz DDR2 FB-DIMM RAM. I am trying to cross-convert these files for editing with the multiclip feature in FCP.

The reason I want to convert to ProRes 422 first is, again, so I can make use of multiple cores when compressing for DVD. Right now, a 2-hour XDCAM EX timeline takes over 5 hours to export from Final Cut Pro to ProRes 422, and over 15 hours if I go straight out of Final Cut Pro into Compressor (downscaling to MPEG-2). Of course, distributed encoding does not work while Final Cut Pro is rendering for Compressor, so I wanted to output to ProRes 422 first (keeping all my chapter marks) and then send the finished clip to Compressor. I generally need to output a reference file anyway for use in Logic when creating the Dolby 5.1 surround mix, so this seemed like a reasonable workflow.

I figured that if I start with ProRes 422 to begin with, the export from Final Cut Pro should take a lot less time (theoretically). Traditionally, my reference file had been an XDCAM EX file, as it takes far less time to export to the native codec than it does to convert to ProRes 422 (which also puzzles me).
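
Once the ProRes 422 master is exported, the second step would just be handing that one file to the cluster for the MPEG-2 pass. A minimal sketch of that step, with the same caveat that the flag names, setting, and paths are assumptions that need checking against your Compressor version:

import subprocess

COMPRESSOR = "/Applications/Compressor.app/Contents/MacOS/Compressor"

# Step 2 of the workflow: submit the self-contained ProRes master (exported
# from FCP with chapter markers) to the cluster's MPEG-2 setting for DVD.
# Flag names are assumptions -- check them against your own install.
subprocess.run([
    COMPRESSOR,
    "-clustername", "My Qmaster Cluster",
    "-batchname", "ProRes master to MPEG-2",
    "-jobpath", "/Volumes/Media/show_prores422_master.mov",
    "-settingpath", "/Users/me/Library/Application Support/Compressor/DVD-MPEG2.setting",
    "-destinationpath", "/Volumes/Media/dvd/",
], check=True)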

Anyway, this is probably much more info than anybody needed to know, but I thought I'd share as much as possible so people could get a good idea of what I am trying to do here :)




David Burch
Re: Merging distributed QuickTime segments
on Sep 1, 2009 at 6:38:09 am

Upon further investigation of the log files, I compared one clip that had successfully completed with another that is still trying to "merge distributed QuickTime segments". I found that the problem clip's log contained this error:

exception = error: FlattenMovieDataToDataRef, status=-2019

I searched the log file of the clip that finished and did not find any reference to this exception whatsoever. If anybody is savvy with QuickTime exceptions and knows what this could mean for the distributed compression process, that might be helpful information.
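
For what it's worth, if my memory of the old QuickTime Movies.h error constants is right, -2019 corresponds to progressProcAborted, which would suggest the flatten/merge step is being cancelled rather than failing on the media itself. A tiny Python lookup I used while digging through the logs (the table is transcribed from memory, so please verify the values against the actual header):

# Partial table of QuickTime Movie Toolbox error codes around -2019,
# transcribed from memory of the Movies.h header -- verify before trusting.
MOVIE_TOOLBOX_ERRORS = {
    -2016: "cantPutPublicMovieAtom",
    -2017: "badEditList",
    -2018: "mediaTypesDontMatch",
    -2019: "progressProcAborted",      # what FlattenMovieDataToDataRef returned here
    -2020: "movieToolboxUninitialized",
}

def describe(status):
    # Return a human-readable name for a QuickTime status code, if known.
    return MOVIE_TOOLBOX_ERRORS.get(status, "unknown QuickTime error %d" % status)

print(describe(-2019))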



Trevor Allin
Re: Merging distributed QuickTime segments
on Oct 6, 2009 at 6:46:34 am

Hi

I am having this problem too; the time is killing me.

Anyone get any further on this issue?

Thanks

Trevor



Jeremy Belzer-Adams
Re: Merging distributed QuickTime segments
on Nov 11, 2009 at 7:16:51 pm

Did any of you figure this out? I'm having the same problem!




Bruce Little
Re: Merging distributed QuickTime segments
on Oct 26, 2010 at 5:30:19 am

I am also having this problem with a new 12-core 2.66 GHz machine with 12 GB of RAM.
Are there any updates on this?



Dave Lindsay
Re: Merging distributed QuickTime segments
on Jan 13, 2011 at 7:51:24 pm

There must be some unoptimized or rate-limiting bit of code in Qmaster... I can't explain why the "Status: Processing: Merging distributed QuickTime segments" step takes so long.

We have a Compressor v3 setup with 24-36 virtual cores, depending on configuration. The thing blazes through the encode process; it's beautiful how fast it runs. But when it gets back to merging segments, disk activity does not exceed 15-20 MB/sec read/write. I tested my disks with concurrent read/write (this is a simple RAID 0 array) and was seeing 80-90 MB/sec concurrent read and write on the same volume.
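
For reference, the concurrent read/write test was nothing fancy; it was roughly the following Python sketch (the paths and sizes are placeholders, and the test files have to live on the volume you want to measure):

import os
import time
import threading

CHUNK = 8 * 1024 * 1024            # 8 MB per I/O operation
TOTAL = 2 * 1024 * 1024 * 1024     # read/write 2 GB in each direction
SRC = "/Volumes/RAID/existing_large_file.mov"   # pre-existing file on the test volume
DST = "/Volumes/RAID/throughput_test.bin"

def reader():
    # Sequentially read TOTAL bytes and report MB/sec.
    start, done = time.time(), 0
    with open(SRC, "rb") as f:
        while done < TOTAL:
            buf = f.read(CHUNK)
            if not buf:
                break
            done += len(buf)
    print("read  %.1f MB/sec" % (done / 2.0**20 / (time.time() - start)))

def writer():
    # Sequentially write TOTAL bytes and report MB/sec.
    start, done = time.time(), 0
    data = os.urandom(CHUNK)
    with open(DST, "wb") as f:
        while done < TOTAL:
            f.write(data)
            done += len(data)
        f.flush()
        os.fsync(f.fileno())
    print("write %.1f MB/sec" % (done / 2.0**20 / (time.time() - start)))

# Run the read and the write concurrently against the same volume.
threads = [threading.Thread(target=reader), threading.Thread(target=writer)]
for t in threads:
    t.start()
for t in threads:
    t.join()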

There's no knob, switch, or settings-file parameter for this that I can find. It's incredibly frustrating that the disk operation takes LONGER than the encode.

Does anyone know if this also exists in Qmaster 3.5?


