
NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...

COW Forums : Apple Final Cut Pro X Debates

David Lawrence
NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 29, 2011 at 8:54:14 pm

This is a continuation of an interesting sub-thread buried under a completely unrelated topic. If you're just tuning in, a good place to start reading would be here:

http://forums.creativecow.net/readpost/335/16693

Let's continue exploring the idea of musical composition as a metaphor for editorial process, and how the language of audio-centric workflows differs in a multi-tracked open timeline vs. the magnetic timeline.

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 29, 2011 at 9:08:39 pm

Thanks, David.



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 12:11:06 am

Just curious, did you get a chance to answer those questions in the last post to you (from the link)?




David Lawrence
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 12:18:27 am

[Jeremy Garchow] "Just curious, did you get a chance to answer those questions in the last post to you (from the link)."

They're great questions. I've been distracted by work this afternoon ;) but I'll answer with some screen grabs when I get a free minute.

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 12:29:25 am

No worries and certainly no rush. Screen grabs were pie in the sky.

Just want to make sure I didn't lose the thread! :)



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 1:46:07 am

Sound post has been my speciality for 27 years, and I have been involved in software development with Fairlight and dSP, often alpha testing. Firstly, let me observe that the magnetic timeline seems to me to be a methodology to both reduce the screen real estate needed and automatically prevent clip overwriting.

Both of these issues were solved over a decade ago by dSP and Fairlight, but interestingly many other DAWs have not solved both. Screens are dynamic on Fairlight, with the ability to jump between displaying any number of tracks, so it is possible to jump from an overview of 48 or 96 tracks to just a group of selected tracks. This dynamic resizing is extremely quick and intuitive. The paradigm of DAW editing is the selection of tracks to edit, not just clips, so track-based arrangement is important to that. NLEs are the clumsiest audio editing devices I have encountered.

The issue of overwriting I have mentioned here before. Clips stack non-destructively in Fairlight/dSP. This means nothing is overwritten, merely hidden beneath. You can rotate the stack and reveal all the layers. The convention is that only the top layer plays, but you can crossfade to layers underneath. This allows very powerful, subtle, sub-frame dialog editing on a single track. So both issues that FCPX seems to be trying to tackle have already been solved in a more powerful and elegant way.
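
To make the clip-stack idea concrete, here is a minimal sketch in Python -- purely illustrative, with invented names, and not how Fairlight/dSP actually implements it:

    # Minimal illustration of a non-destructive clip stack on one track.
    # Only the top layer "plays"; lower layers stay intact and can be
    # revealed by rotating the stack. All names are hypothetical.

    class Clip:
        def __init__(self, name, start, duration):
            self.name = name
            self.start = start          # timeline position in seconds
            self.duration = duration

    class ClipStack:
        """Clips occupying the same span of one track, newest on top."""
        def __init__(self):
            self.layers = []            # index -1 is the top (audible) layer

        def place(self, clip):
            self.layers.append(clip)    # nothing is overwritten, just covered

        def top(self):
            return self.layers[-1] if self.layers else None

        def rotate(self):
            # Reveal the layer underneath without destroying anything.
            if len(self.layers) > 1:
                self.layers.insert(0, self.layers.pop())

    stack = ClipStack()
    stack.place(Clip("dialog_take_1", start=10.0, duration=4.0))
    stack.place(Clip("dialog_take_2", start=10.0, duration=4.0))
    print(stack.top().name)   # dialog_take_2 plays
    stack.rotate()
    print(stack.top().name)   # dialog_take_1 is revealed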

The most important thing about tracks in DAWs is the power of track-based processing. Clip-based processing is terribly limited when it comes to something as mainstream as a string of dialog with multiple edits. The DAW approach is to do sophisticated dialog edits on a single track and then be able to apply dynamics, EQ or any other sort of processing to that track. This avoids having to paste a set of plugins onto multiple clips and then tweak each clip. Track-based processing also allows for processing post-fader, so dialog levels can be automated to chase a level before it goes into a compressor/limiter.

The track then has a further important 'role' in routing to a bus. A bus is simply a master channel where all the dialog, music or FX tracks go for grouping and final manipulation. The buses then need to be further routed to a master bus for the final mix. So the hierarchy is clip > track > bus > master. At every stage this hierarchy allows overarching processing. On the Fairlight I can use plugins on a clip, track, bus and master. This stacked processing is incredibly powerful and the reason why properly posted sound is beyond NLE tools. STP is not a particularly good DAW, but it does follow the DAW convention of clip > track > bus > master. Play with it and see the power of a track-based hierarchy and track-based processing.
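
A rough sketch of that clip > track > bus > master hierarchy with processing available at every level -- illustrative Python only, invented plugin names, not any real DAW's API:

    # Sketch of the clip > track > bus > master routing hierarchy, with
    # processing available at every level. Plugin names and routing are
    # invented; this is not any real DAW's API.

    class Node:
        def __init__(self, name, plugins=None):
            self.name = name
            self.plugins = plugins or []   # processing applied at this level
            self.children = []             # clips feed a track, tracks feed a bus...

        def add(self, child):
            self.children.append(child)
            return child

        def print_chains(self, downstream=()):
            # Accumulate processing in signal order: clip/track first, master last.
            downstream = tuple(self.plugins) + downstream
            if not self.children:
                print(f"{self.name}: {' -> '.join(downstream)}")
            for child in self.children:
                child.print_chains(downstream)

    master = Node("master", ["master limiter"])
    dialog_bus = master.add(Node("dialog bus", ["bus EQ"]))
    dialog_track = dialog_bus.add(Node("dialog track", ["compressor"]))
    dialog_track.add(Node("clip: interview_01"))
    dialog_track.add(Node("clip: interview_02"))

    master.print_chains()
    # clip: interview_01: compressor -> bus EQ -> master limiter
    # clip: interview_02: compressor -> bus EQ -> master limiter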

So to FCPX and Roles. As you can see, DAWs harness the power of a multi-role model. So how in FCPX can you define audio to be a single role when roles within roles are the basics of sound post? All we ask is for an OMF to rearrange the roles-based, trackless edit environment of FCPX into tracks. I can assure you that whatever those tracks are, we will further refine them anyway, so I don't need perfectly laid audio from a locked edit.

I cannot stress strongly enough that FCPX, or any NLE, is incapable of proper sound posting. Vegas comes closest. So the issue of what will ultimately work for editors needs to be thrashed out, and I think the magnetic timeline + Roles is an inferior approach to problems that are ancient history in DAW land.




Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 2:30:58 am

First of all, thanks for sharing, this is great. This is kind of what I was getting at: video editing and audio post have separate processes. Whatever I hand over, it's going to change. When I get a timeline for color correction, I change it so it makes more sense for the task at hand. Fairlight sounds cool.

[Michael Gissing] "Play with it and see the power of a track based hierarchy and track based processing."

This is how the M100 used to work back around the turn of the millennium, and I miss it (clip, track, bus, master). FCP has never had it, and it doesn't look like it will now, but for my use (which is basically keeping things separate for audio post, or, if it's not going out to post, having tools in the timeline that work decently and efficiently), Roles can come really close with some upgrades. This is where I think a metadata-based editor could be very powerful, as you can group items, even seemingly disparate items, very easily and no matter where they are in the timeline. In essence, it's creating a bus, and the index is the map legend, or patch panel. Compound clips could then become tracks of a sort, in that you could apply effects to multiple clips. And look, I know all of this is a stretch as it's not available today, right now.

[Michael Gissing] "So how in FCP X can you define audio to be a single role when roles within roles are the basics for sound post"

FCPX has roles within roles. The control is a bit limited at this point, but the capability is certainly there.

[Michael Gissing] "All we ask is for an OMF to re arrange the roles based trackless edit environment of FCPX into tracks. I can assure you that whatever those tracks are, we will further refine them anyway so I don't need perfectly laid audio from a locked edit. "

Every audio post house I work with says the exact same thing: as long as nothing is mixed down, they're good.
I'm nice, and I like receiving organized projects, so I do my best to keep things as logical as possible without too much fuss before I hand it off. I think Roles will be able to pass the proper information to an OMF handoff that will equate to audio tracks, rather than the current 'stem' export you get now.

[Michael Gissing] "So the issue of what will ultimately work for editors needs to be thrashed out and I think the magnetic timeline + roles is an inferior approach to problems that are ancient history in DAW land."

Thanks so much for your perspective.



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 2:36:45 am

[Jeremy Garchow] "First of all, thanks for your sharing, this is great. This is kind of what I was getting at that video editing and audio post have separate processes."

I have always wondered why NLEs don't think of the clip-stack-per-track approach for video. In a way, a clip stack plus track-based processing is like a nest, but potentially much more powerful, as each layer can be manipulated in the timeline.



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 3:26:22 am

[Michael Gissing] "I have always wondered why NLEs don't think of the clip stack per track approach for video. In a way a clip stack plus track based processing is like a nest but potentially much more powerful as each layer can be manipulated in the timeline."

Auditions in FCPX are attempting this very notion. Not the same, as you can only see one clip at a time (and can't dissolve from one clip to the other), but they are there in the timeline at the same time and selectable.

I think that audio and video are very different, and have very different philosophies, and of course are extremely complementary. Certainly, they can both motivate each other. When it's all said and done, there's one piece (or layer) of flattened video playing at once. For audio, there are at least two, and sometimes many more, even in a master. I really think, psychologically (and physically), it stems from this.

Relatively modern sound design and recording has been about more tracks and layers; visuals are more singular. These very notions are being changed every day with the advances in stereoscopic cinema and the complications of keeping it all in sync, and FCPX is taking an honest shot at changing these notions, or at least exploring new ways of doing things.




Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 3:53:20 am

[Jeremy Garchow] "I think that audio and video are very different, and have very different philosophies, and of course are extremely complementary. Certainly, they can both motivate each other."

I have spoken to software developers about the idea of using video composite-layer techniques in audio. Straight opacity is a bit like mixing in audio terms, but key, difference and luma-type vision mixing are different. Similarly, audio mixing might blend sound based on dynamics or frequencies.

A typical one-hour doco project for me will have around 5,000 clips, so the ratio of audio to picture elements is hugely different. The other great difference is data management. I have a sound effects library integrated in the Fairlight that manages 30,000+ sound clips. We don't need metadata, however, as clip names are descriptive, so standard word searches with +/- delimiters enable quick searching and inline auditioning on the track before a simple Enter to paste.
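
As a sketch of that kind of +/- word search (the library names and search syntax here are invented for illustration, not Fairlight's actual implementation):

    # Rough sketch of a +/- word search over a descriptive-filename
    # library. Library contents and syntax are invented for illustration.

    library = [
        "car_interior_driving_steady",
        "car_exterior_pass_by_fast",
        "rain_on_car_roof_interior",
        "crowd_interior_restaurant",
    ]

    def search(query, names):
        """'+word' (or a bare word) must appear in the name; '-word' must not."""
        terms = query.split()
        include = [t.lstrip("+") for t in terms if not t.startswith("-")]
        exclude = [t.lstrip("-") for t in terms if t.startswith("-")]
        return [n for n in names
                if all(w in n for w in include) and not any(w in n for w in exclude)]

    print(search("+car +interior -rain", library))
    # ['car_interior_driving_steady']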

If FCP X can use metadata to do similar manipulation, then editing should be a much nicer process.



David Lawrence
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:37:59 am

[Michael Gissing] "I have spoken to software developers about the idea of using video composite layer techniques in audio. Straight opacity is a bit like mixing in audio terms but key, difference, luma type vision mixing is different. Similarly audio mixing might be blending sound based on dynamics or frequencies in audio. "

Michael, thanks for bringing up the Fairlight and its innovative UI for audio. This is the closest thing I've seen that uses layers as an approach to audio compositing. I've played with it a bit but never really got deep into it. It seems interesting:

http://www.audiofile-engineering.com/waveeditor/

Also, you spoke about the role of tracks. I'm also curious what you think about ripple-mode for edits. From your POV as an audio post specialist, could you do your job if your tools only operated in ripple mode?

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:16:36 am

[David Lawrence] "Also, you spoke about the role of tracks. I'm also curious what you think about ripple-mode for edits. From your POV as an audio post specialist, could you do your job if your tools only operated in ripple mode?"

I will have a look at the software linked but in terms of ripple edit I have never used it, although it is an available option on the Fairlight. Sound post is about a fixed timeline so ripple editing is useless.

Similarly, when I use FCP for online conform I also never use ripple edit, as the timeline duration and sync are all locked. I do use ripple when I (very occasionally) edit video, but it should be a toggle option in my opinion.




Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:28:37 am

[David Lawrence] "
http://www.audiofile-engineering.com/waveeditor/"


Interesting, although they do make a big claim by saying they are "introducing the concept of Layers". I will download the demo and see if they are just layering FX processing onto a clip or actually using effects to mix multiple audio layers. I suspect from the limited description on the website that they may not be doing what I was describing.

Thanks for the link. If nothing else it looks like a useful tool for integrating VST plugins on a Mac.



David Lawrence
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 6:11:46 am

[Michael Gissing] "Thanks for the link. If nothing else it looks like a useful tool for integrating VST plugins on a Mac."

You're welcome. It seemed quite powerful from playing with the demo. The price is right, but it was more learning curve than I had time for. Curious what you'll think.

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl




Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:37:53 am

[Michael Gissing] "We don't need meta data however as clip names are descriptive so standard word searches with + - delineators enabling quick search and inline auditioning on the track before a simple Enter to paste. "

Well, technically, clip names are metadata.

Best,
Andy



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:46:03 am

[Andrew Richards] "Well, technically, clip names are metadata."

My point was that in tapeless video, clip names are useless as a descriptor of content, so metadata is added. Audio libraries use the name as the descriptor, similar to log & capture using a descriptive name as the file name. You can read this data without digging into the file headers to read additional (meta)data.

Metadata is essential in a tapeless camera world, and I would love to see the metadata descriptor transferred via an OMF rather than the clip name.



Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:55:56 am

[Michael Gissing] "My point was that in tapeless video, clip names are useless as a descriptor of content so meta data is added. Audio libraries use the name as the descriptor similar to log & capture using a name descriptor as the file name. You can read this data without digging into the file headers to read additional (meta) data."

It's an extra step, but tapeless video clips can be systematically renamed to suit what you'd like to see. FCP7 offered a similar capability during Log & Transfer. FCPX lets you do it at any time. It sounds like it would be well worth the effort if it aids the post pipeline downstream from the editor.

Best,
Andy




Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:59:05 am

[Andrew Richards] "It's an extra step, but tapeless video clips can be systematically renamed to suit what you'd like to see."

I also do color grading, so I never advise any editor to change file names during L&T, as it breaks the workflow to relink original RED media or generally confuses an offline/online workflow. That's why I would prefer the metadata descriptor to be transferred via OMF (as an option).



Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:09:21 am

[Michael Gissing] "I also do color grading so I never advise any editor to change file names during L&T as it breaks the workflow to relink original RED media or generally confuses an offline/ online workflow. That's why I would prefer the meta data descriptor to be transferred via OMF (as an option)."

Ah. Does OMF support that kind of thing? I tried skimming it out of the OMF spec but couldn't suss it out. FCPXML certainly does with roles.

Best,
Andy



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:12:06 am

[Andrew Richards] "Does OMF support that kind of thing?"

No. It is up to the software generating the OMF to do that. OMF just deals with clip names.



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 3:58:05 pm

[Michael Gissing] "That's why I would prefer the meta data descriptor to be transferred via OMF (as an option)."

I have no idea, but would OMF support this? Meaning, could I send both the "ClipName" and "UserclipName" to OMF somehow? I think it would be easy to write the translation, but OMF would have to support multiple clip naming fields. Or maybe you get an OMF and an XML of metadata.



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 10:04:07 pm

[Jeremy Garchow] "I have no idea, but would OMF support this?"

No, Jeremy. OMF only supports a clip name. It is an option that should be part of translating the project file to an OMF composition. The editor would have the option to choose clip name or description, which would then default to metadata.
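
A toy sketch of the naming option being asked for: the interchange format only carries one clip-name field, so let the editor choose which source field fills it before translation. The clip records and field names below are invented; this is not OMF's API or any shipping exporter.

    # Sketch of the requested naming option: the interchange format only
    # carries one clip-name field, so let the editor choose which source
    # field fills it before translation. Records and field names invented.

    def clip_names_for_export(clips, use_description=True):
        """Return the single name string each exported clip should carry."""
        names = []
        for clip in clips:
            if use_description and clip.get("description"):
                names.append(clip["description"])
            else:
                names.append(clip["file_name"])   # fall back to the camera name
        return names

    clips = [
        {"file_name": "DSC_102856_1233", "description": "car_interior_driving"},
        {"file_name": "DSC_102857_0007", "description": ""},
    ]
    print(clip_names_for_export(clips))
    # ['car_interior_driving', 'DSC_102857_0007']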

OMF is long since EOL, which ironically makes it robust as an interchange format.



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 11:30:56 pm

[Michael Gissing] "The editor would have the option to choose clip name or description, which would then default to metadata."

Does Fairlight accept XML in any form?



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 11:37:08 pm

[Jeremy Garchow] "Does Fairlight accept XML in any form?"

Yes. I have been able to import FCP projects via XML for a few years, but I still prefer to use OMF as it moves and manages the media into a single file, which is much easier for backup management. Also, the Fairlight runs on a Win XP platform, so connecting Mac-formatted drives would need MacDrive software. As I have FCP, I just export an OMF over the network directly to the NAS unit that is the audio drive for the two Fairlights.

Fairlight will do its best to flatten the video to a single track, and it will translate scaling, basic wipe and dissolve FX, but obviously not plugins or speed ramps. Audio plugins are also ignored. Again, I find it easier to make an H.264 reference QT.



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 11:41:55 pm

[Michael Gissing] "Again, I find it easier to make an H264 reference QT."

That's what I do for most of the audio guys; some still prefer DV, which is a pain for me! ;)

I was thinking that maybe Fairlight could conform clip names from the OMF and a sequence XML (or allow for clip name choice). The problem with metadata (and Walter Soyka pointed this out) is that there's no standard. One man's User Clip Name is another man's Location.

Jeremy



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 11:51:46 pm

[Jeremy Garchow] "I was thinking that maybe fairlight could conform clip names from the OMF and a sequence XML"

XML simply points to original media, so that won't correlate to OMF media, which has been truncated with handles. I will be talking to the Fairlight guys, who are very savvy with interchange formats, but I think they will utilise the FCPX XML to allow direct interchange both in and out.

The OMF idea is more a plea to whoever does the translation to offer a naming option for the clips. I am sick of seeing DSC_102856_1233 as a file name instead of car_interior_driving.



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Oct 1, 2011 at 12:11:44 am

[Michael Gissing] "XML simply points to original media so that won't co-relate to OMF media which has been truncated with handles. I will be talking to the Fairlight guys who are very savvy with interchange formats but I think they will utilise the FCP X XML to allow direct interchange both in and out."

Yeah, sorry, I should have been more specific: I meant FCPXML, as I am sure this will be used for some sort of FCPX OMF export.

I understand there's no tie between FCP7 OMF and sequence XML.

This brings up more great points. Thank you.



Chris Harlan
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:15:23 am

Great conversation, you guys. I'm just trying to catch up now.



David Lawrence
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 3:09:14 am

Picking up from the questions in Jeremy's post here:

http://forums.creativecow.net/readpost/335/16855

[Jeremy Garchow] "Let me ask you this. Do you think FCPX will ever get tracks? Do you think they will work with the trim tool? By adding tracks, don't you think we will lose some of the editing efficiency?"

Will FCPX ever get tracks? I don't know. I tend to doubt it because I believe Apple is heavily invested in the marketing and the engineering direction FCPX represents. I think it would take a lot of pressure to turn them around. Not that it's impossible -- I've pointed to what I see as evidence that tracks are integral to AV Foundation. But I also agree with Craig Seeman's observations about other parts of the AV Foundation spec. It feels like the UI is an expression of this object data model rather than the track parts of the model. That would explain many of the object behaviors as well as the constraints.

This is pure conjecture on my part but I believe that at one point, FCPX did have tracks and that there was a big internal fight for the soul of this product. The track people lost and they were the ones fired in 2010.

I do think that tracks could be brought back with a hybrid system that gives us the best of both tracked and trackless. The first step would be allowing more than one primary storyline. Don't even call it primary, just make them storylines (i.e. fixed tracks) and connected storylines (i.e. secondary storylines). This would solve many of my problems right away. That may be tough or impossible because of the data model. I don't know what they're doing under the hood.

Re: trim tool and editing efficiency - I think we need to break this into two related but different aspects of the magnetic timeline's behavior: 1) ripple mode, and 2) tracklessness.

Ripple mode and trimming:

There's a lot to like about the FCPX trim tool, especially how it's context sensitive depending on cursor position. I'd love to have that in FCP7. But it's never been a big deal tapping A, R or RR depending on what I want to do.

I hit RR for ripple trim maybe 5% of the time on the actual timeline. I'm more likely to cut or trim by grabbing the clip edge, then delete or shift-delete depending on whether I want to ripple or not.

If I want to ripple, I often go into trim mode. It took me years to figure this out -- the trim mode in FCP you get from double-clicking an edit is in many ways as good as Avid's legendary "rock and roll" keyboard trimming. I use this mode to loop around the edit point, making one-frame adjustments on either side of the edit. It's done entirely by feel: +1, -1, +2, etc. No need for filmstrips. This is how I do 95% of my rippling. It's a very specific tool for a very specific need.

Rippling is the default mode for the magnetic timeline. This default mode is the opposite of how I edit most of the time. To return to the music metaphor:

If a clip is a note and the timeline is the composition sheet, I want to place the note anywhere I want in time and have it stay there as I build the piece. It's all about having absolute control over time and pacing by placement in space. This is intrinsic to the open timeline and tracks. With this in mind, can you see how if the system starts mashing notes together (and up and down on the staff) all by itself, or forces me to insert spacers to hold the notes apart, it might be a problem?

I'm not saying the magnetic timeline is inefficient. I think it's very efficient, but for very specific, specialized tasks that aren't the default in my usual workflow.

[Jeremy Garchow] "Also, do you think that this style of editing simply is impossible in FCPXs timeline?"

I've done it. It's possible, but the process isn't exactly what I'd call efficient (or fun).

Stay tuned, I'll say more in another post.

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl



Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 3:28:32 am

[David Lawrence] "I've pointed to what I see as evidence that tracks are integral to AV Foundation. But I also agree with Craig Seeman's observations about other parts of the AV Foundation spec. It feels like the UI is an expression of this object data model rather than the track parts of the model. That would explain many of the object behaviors as well as the constraints."

I maintain that AVFoundation forces nothing on the UI. The UI is 100% abstracted from the OS frameworks that handle the actual bit-laying. Walter Soyka! Back me up!

[David Lawrence] "This is pure conjecture on my part but I believe that at one point, FCPX did have tracks and that there was a big internal fight for the soul of this product. The track people lost and they were the ones fired in 2010."

Maybe he was flat out lying, but Jobs reportedly said the layoffs were from support, not engineering. Then again, that was the infamous "awesome" email, so I'm probably not changing anyone's mind with such evidence...

Best,
Andy



David Lawrence
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 3:45:14 am

[Andrew Richards] "I maintain that AVFoundation forces nothing on the UI. The UI is 100% abstracted from the OS frameworks that handle the actual bit-laying. Walter Soyka! Back me up!"

LOL, Andrew, I'm in total agreement with you and Walter on this. Sorry if my point wasn't clear. UI is always 100% abstraction. It can be anything the UI designer wants it to be. That's why many of the UI decisions in FCPX are so baffling. For me, they only start making sense if you imagine that the FCPX UI was designed by the engineers. The constraints, inconsistencies and mental models in the UI feel like they're driven by an engineering data model, not an understanding of users.

That said, I do wonder how much of the UI is baked in. Why, for example, is their solution for adding transitions to connected clips to turn them into secondary storylines? My hunch is that the data model demands it.


_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl



Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:34:45 am

[David Lawrence] "That's why many of the UI decisions in FCPX are so baffling. For me, they only start making sense if you imagine that the FCPX UI was designed by the engineers. The constraints, inconsistencies and mental models in the UI feel like they're driven by an engineering data model, not an understanding of users."

Yeah, the thing is, I do think they were modeling the UI after a particular data model, just not one made mandatory by the structures of the underlying media-handling frameworks. As we've discussed at length, they anchor clips to other clips rather than to a time frame. Time is kept, but isn't the structure for keeping track of what goes where. Time is just a meter for the music, if you'll pardon my hijacking of the metaphor you opened this thread with.

I think the broad idea was to find a way to capture a more explicit expression of editorial intent. In an open tracked timeline, intent is all in the mind of the user. As far as the software is concerned, that lower third only happens to sit atop the right bit of talking head. That music just happens to come in at the right beat in the action above it. Tracks 1 and 2 are dialog, 3 and 4 are music, and 5 and 6 are SFX, but the NLE doesn't know that.

In the magnetic timeline, the software is actually told which clips are married to other clips. I think this is all born out of a philosophy that the software should try to glean as much actionable metadata as it can without asking the user to make ancillary inputs strictly for the sake of metadata. They can't eliminate all manual entry of metadata, but they can keep it to what's actually necessary, and once metadata is ubiquitous, they can start automating things.

I also agree this all reeks of engineering, and that is probably why it appeals to me so much. From an engineer's perspective, everything is input/output, and you can satisfy all I/O requirements with metadata. Roles are an excellent example: they get you to the same ends as audio track conventions. Users, on the other hand, are concerned with technique as much as they are with I/O (if not more so). The I/O needs to be there to get the job done, but the technique is craft, and craft is sacred.

I imagine the evolution like this: they wanted to capture explicit editorial intent, so they thought up clip connections. These could work on tracks, but then collisions become much more difficult to manage while maintaining track roles if you have a stack of staggered connected clips you want to move in concert. So get rid of the tracks! OK, but what about the roles we conventionally assign to tracks, like DME? Explicit metadata tags! So now we can capture explicit intent, prevent collisions while moving these stacks of explicitly connected clips, capture more explicit intent in the form of roles metadata, and then use that to route output. Cut! Check the gate!
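
To make that concrete, here's a toy sketch of "use roles metadata to route output": tag clips with dialogue/music/effects roles and pull stems from the tags instead of from track positions. The data and field names are invented; this isn't Apple's implementation.

    # Toy illustration of "use roles metadata to route output": tag clips
    # with roles, then pull stems from the tags instead of from track
    # positions. Data and field names are invented.

    from collections import defaultdict

    timeline = [
        ("interview_01", "dialogue"),
        ("score_cue_1",  "music"),
        ("lower_third",  "titles"),
        ("whoosh_03",    "effects"),
        ("interview_02", "dialogue"),
    ]

    stems = defaultdict(list)
    for name, role in timeline:
        stems[role].append(name)

    for role in ("dialogue", "music", "effects"):
        print(f"{role} stem:", stems[role])
    # dialogue stem: ['interview_01', 'interview_02']
    # music stem: ['score_cue_1']
    # effects stem: ['whoosh_03']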

I have no ideas for how capturing all this explicit intent might be exploited for additional functionality, but that's what it looks to me like they are aiming for.

Best,
Andy



David Lawrence
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:56:33 am

[Andrew Richards] "I think the broad idea was to find a way to capture a more explicit expression of editorial intent. In an open tracked timeline, intent is all in the mind of the user. As far as the software is concerned, that lower third only happens to sit atop the right bit of talking head. That music just happens to come in at the right beat in the action above it. Tracks 1 and 2 are dialog, 3 and 4 are music, and 5 and 6 are SFX, but the NLE doesn't know that."

I guess it depends on how we define "editorial intent". I would argue that editorial intent is explicit and intrinsic to spatial positioning in the timeline. When I look at a timeline, I read editorial intent like a musician reading sheet music.

The idea that the software might understand the edit is interesting, but I'm hard pressed to think of any examples where it would be useful. Can you describe an example of the potential value?

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl



Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:21:06 am

[David Lawrence] "I guess it depends on how we define "editorial intent". I would argue that editorial intent is explicit and intrinsic to spatial positioning in the timeline. When I look at a timeline I read editorial intent like a musician reading sheet music."

Yes, you know the intent. Your collaborators know your intent based on conventions you share. But the software doesn't know anything more than that clip is on that track from that timecode value to that other timecode value.

[David Lawrence] "The idea that the software might understand the edit is interesting, but I'm hard pressed to think of any examples where it would be useful. Can you describe an example of the potential value?"

Well no, I said I have no ideas for how capturing all this explicit intent might be exploited for additional functionality. Maybe I'm way off and all they wanted to do was simplify rearranging chunks of an edit compared to how you'd do it with the arrow tool in FCP7.

I've got a few good ideas for roles metadata, but no epiphanies for connected clip relationships. Maybe something to do with flexibility in how the timeline is visualized (my old saw)? Nothing much springs to mind for me either.

Best,
Andy



Bill Davis
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 6:30:48 pm

Maybe my thinking is too simplistic, but it seems to me that before X, all editors worked on stacks of separate entities.

In X, the stack is its own entity.

That is the fundamental abstraction they shifted.

Links (metadata) reach into the connected clip entity. But the entity remains discrete. And as such, you're less assembling a timeline of disparate, discrete entities than building new entities that become a flexible library of assets in and of themselves.

At some point, if they are linked by metadata tags - those discrete entities may form foundational building blocks that can be mixed and matched (and more importantly SEARCHED and RETRIEVED) based on their tags.

Right now, there's no way to "reach in" to an FCP legacy timeline and link to its building blocks. With connected clips, it should be trivial at some point in the future.

That could be a useful new holistic overview that would work very well in connected editing distribution models in the future.

Just imagining.

"Before speaking out ask yourself whether your words are true, whether they are respectful and whether they are needed in our civil discussions."-Justice O'Connor



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 7:17:26 pm

[Bill Davis] "Maybe my thinking is too simplistic, but it seems to be that before X, all editors worked on stacks of separate entities.

In X, the stack is its own entity.

That is the fundamental abstraction they shifted. "


Yes. Or as I like to say, the clip relationships have changed. The vertical relationships are now more locked together than the horizontal relationships. Horizontal clip relationships are now more user-defined. If you need two clips to be next to each other on the same layer, you select them and make a storyline, compound clip, whatever. In the FCPX timeline, there are visual cues letting the editor know what is going on, i.e. "this part of the timeline is different". It is this language that is certainly different from other NLEs.

[Bill Davis] "Right now, there's no way to "reach in" to an FCP legacy timeline and link to its building blocks. With connected clips, it should be trivial at some point in the future."

And most use tracks this way. It forces this sort of organization up front, but it hits limits pretty fast.

[Bill Davis] "Just imagining."

Shame on you.



Walter Soyka
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 6:57:39 pm

[David Lawrence] "I guess it depends on how we define "editorial intent". I would argue that editorial intent is explicit and intrinsic to spatial positioning in the timeline. When I look at a timeline I read editorial intent like a musician reading sheet music."

I think there are a few important new concepts around editorial intent in FCPX that we generally take together, but should really consider separately (as they could be implemented independently):
  • FCPX stores information about clips' relationships to each other (clip connections), and uses it during some editorial operations.
  • FCPX has added containers (storylines) which formalize the relationship of a series of clips connected in time (horizontal position), arguably at the expense of context (vertical position).
  • FCPX works in relative time.



[David Lawrence] "The idea that the software might understand the edit is interesting, but I'm hard pressed to think of any examples where it would be useful. Can you describe an example of the potential value?"

Clip connections strike me as useful. They can explicitly define a relationship that we can only imply with direct spatial positioning on an unconnected open timeline. With clip connections, you can link a graphic to a particular point in dialogue, or you can link a sound effect to a particular visual. That relationship is defined once, then honored through subsequent editorial operations (by rippling). Without clip connections, the editor must read the timeline, recall or deduce the connection from spatial positioning, and make the correct selection in order to preserve the relationship through an edit. And this process must be repeated every single time an edit would affect the relationship.
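
A tiny sketch of why that helps, in illustrative Python (the clips, durations and connection structure are invented; this is not FCPX's actual data model): a sound effect is connected to a point inside a picture clip, and a ripple delete upstream moves both together without the editor re-selecting anything.

    # A sound effect is connected to a point inside a picture clip; a
    # ripple delete upstream moves both together. Purely illustrative,
    # invented clips and structure -- not FCPX's actual data model.

    storyline = [("intro", 5.0), ("wide_shot", 10.0), ("close_up", 6.0)]
    connection = {"clip": "door_slam", "parent": "close_up", "offset": 2.5}

    def layout(storyline, connection):
        t, starts = 0.0, {}
        for name, duration in storyline:
            starts[name] = t
            t += duration
        starts[connection["clip"]] = starts[connection["parent"]] + connection["offset"]
        return starts

    print(layout(storyline, connection))
    # {'intro': 0.0, 'wide_shot': 5.0, 'close_up': 15.0, 'door_slam': 17.5}

    # Ripple-delete "intro": everything downstream shifts, and the
    # connected effect follows its parent without any re-selection.
    storyline = [c for c in storyline if c[0] != "intro"]
    print(layout(storyline, connection))
    # {'wide_shot': 0.0, 'close_up': 10.0, 'door_slam': 12.5}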

I would have loved to have seen clip connections on an open timeline, without containers (as implemented in FCPX, which destroy hard tracks) and without relative time. You'd sometimes have to add hard tracks to preserve the rest of the edit, but that's why I'd love to see tracks live in groups. I think this is pretty much how Michael Gissing's description of layered clips in a DAW works.

My main issues with the trackless, self-collapsing timeline are much more visual, but I think that would be better in a separate thread.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Walter Soyka
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 7:15:40 pm

After a little more thought, I'd argue that an NLE that "understands" editorial intent knows when to treat clips individually, when to treat them as a group, and how to define what should and should not be included in the group.

Does FCPX get this part right?

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Jeremy Garchow
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 7:27:24 pm

[Walter Soyka] "After a little more thought, I'd argue that an NLE that "understands" editorial intent knows when to treat clips individually, when to treat them as a group, and how to define what should and should not be included in the group.

Does FCPX get this part right?"


It certainly attempts to, and it also allows the user to define these relationships very easily. I think it also lets disparate groups interact without disturbing their relative position.

[Walter Soyka] "...but that's why I'd love to see tracks live in groups. I think this is pretty much how Michael Gissing's description of layered clips in a DAW works."

I think, with finesse, this is exactly what Roles will accomplish (and they already kind of do). Now, I know this is kind of nuts thinking, but if you could sort the timeline by Roles (or "bus"), then the visual issues that people might be having may be alleviated, at least to some degree.



Franz Bieberkopf
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 8:17:53 pm

[Walter Soyka] "Clip connections strike me as useful...I would have loved to have seen clip connections on an open timeline, without containers."

Pro Tools calls it grouping. You can highlight clips and group them (and conversely ungroup them) at will. They then function as a single block of media.

... and in tracks!

I assume there are other examples out there too.

[Walter Soyka] "My main issues with the trackless, self-collapsing timeline are much more visual, but I think that would be better in a separate thread."

... my vote for you to start this thread.

Franz.



Franz Bieberkopf
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:38:22 am

David,

... a great theme for discussion.

Is there a short summary? That thread you link to is very long and meandering, with lots of cross-topics. I couldn't find what might be the start of this.

But I wanted to chime in first to suggest that musical composition is more an analog of editing (rather than a metaphor).

It has been my long sad lament that editing software is primarily viewed as a visual realm (by both designers and users). (Already in this thread it's been shunted in that direction.)

I see no reason not to expect most of the functionality of a DAW in an NLE.

Franz.



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 4:52:12 am

[Franz Bieberkopf] "I see no reason not to expect most of the functionality of a DAW in an NLE."

Particularly as DAWs like Fairlight can import FCP7 XMLs and can do basic cut-and-dissolve video edits. You can even edit H.264 and MPEG-2 natively in Fairlight.



Bill Davis
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:25:57 am

I know I'm going to get yelled at for this, but so be it.

I came out of audio (radio actually) and didn't transition to video work until long after I'd spent time doing basic multi-track audio work, not for music, but for spot work in broadcast.

During that part of my career, the quality and precision of sound editing was EVERYTHING to me. I obsessed over signal-to-noise ratio and wept tears when ping-ponging tracks added a few dB of hiss to a complex mix.

And I tried to carry that audio obsessiveness along with me when I started working primarily in video - but I found I simply could not do it. And I've always wondered why. The best answer I can come up with is that the layered complexities of visual work made it critically important that I keep my attention on the visuals and accept a standard for audio that was less than what I was accustomed to when it was my primary concern. I hated this reality, but I found I had no choice but to accept it.

I came to see that most of my audiences were engaged in the visual realm first and foremost - and that the soundtrack - while absolutely essential to the experience and clearly the vehicle that carried MOST of the communication heavy lifting in every project I did - did NOT need to be absolutely pristine in order to satisfy the audience. It needed to reach a solid, professional standard - and anything that damaged intelligibility was anathema - but an audience engaged in the visuals simply did not DEMAND the same level of audio precision that they did in audio-only production.

I'm NOT for one second saying that sound is not critical to the video experience. It is obviously among the MOST critical aspects. But I also believe that when it accompanies picture, because the audience is presented with a complex mix of visuals and sound, their processing of the audio information gets less of their "concentration" than it would otherwise.

I've seen this time and time again in production. People will watch a movie on DVD coming out of a cheap Korean combo DVD-TV set and be every bit as "engaged" as they are sitting in a Dolby 5.1 surround theater.

I suspect this reality is why audio gets the contemptible treatment it often does in too many video productions.

Again, please, I'm NOT arguing for poor audio quality standards at all. I'm saying that audio for video has NEVER been equal to standalone audio. I've never been on a movie set where someone is running every mic into a Grace preamp and attempting to push for music-industry studio audio standards - and I think there's a reason.

So to think that an audio-for-video system should start with a DAW approach and simply build a video layer on top of that is, to my thinking, a poor idea. It would be excellent for those with a sound obsession, but not for the general market.

Quality directors like Walter Murch have certainly come out of sound. But a whole lot more of them have started with sensitivities to other parts of the movie making process. Sensitivity to acting, to classical storytelling, and perhaps sadly to pyrotechnic techniques have been equal or even more common ways for successful directors to cut their teeth.

Sound deserves every ounce of care and craft that can be brought to it. But to put it at the dead center of the requirements list for an NLE is, IMO, not the best path to general success. If it were, I suspect that Vegas and FCP would have swapped market share long ago.

For what it's worth.

(stopping to put my flak jacket on)

I believe this is because most video work is done at a consumption level below "archival" requirements. While a feature film will potentially last for decades, most visual content has a much shorter functional life.

"Before speaking out ask yourself whether your words are true, whether they are respectful and whether they are needed in our civil discussions."-Justice O'Connor



Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:35:44 am

[Bill Davis] "I've seen this time and time again in production. People will watch a movie on DVD coming out of a cheap korean combo DVD-TV set and be every bit as "engaged" as they are sitting in a Dolby 5.1 surround theater. "

Yeah, but they are getting a crappy image on that thing too. Story and characters are what the audience connects with, not signal to noise ratios or gradients with no visible banding. You're touching on a long-standing meme in which fine quality is lost on the consumer, much to our collective chagrin. Where's Terence Curren? He has a lot to say on that matter.

Best,
Andy



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:38:07 am

[Bill Davis] "So to think that an audio for video system should start with a DAW approach and simply build a video layer on top of that is a poor idea to my thinking"

Although DAWs predated NLEs, they were always designed differently, and for good reason: sound post is a skill set outside most editors' experience, so the job was always going to a DAW to be finished, and the NLE focus has rightly been on video.

What I don't get, however, is why NLEs didn't look at the problems that DAWs solved years ago. My particular interest is why clip-collision handling and the ergonomics of using dedicated controllers never made it into NLEs like FCP.



Bill Davis
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 6:10:52 pm

Michael,

While I heartily agree with you (sound post is a skill set outside most editors' experience), what I find most fascinating is how little regard outsiders typically have for ANY skill set outside their own in this era.

I'm working more and more with still photographers who are coming into video since their DSLRs allow them to shoot motion work along with their stills.

Exactly like sound-oriented people are aghast at how "devalued" their skills are in some video production situations, the still folks, while appropriately placing GREAT value on skills similar to their existing ones (lighting and composition), often can't get their heads around the complexities of sub-disciplines like directing actors, blocking for motion, camera movement and scene pacing - and are MOST challenged when they have to approach totally alien disciplines such as sound for picture.

In the best possible world, everyone will respect everyone else's expertise - neither dismissing nor elevating the importance of quality talent in any area. But it's also going to be part of the modern producer's job to understand when "enough is enough" in pursuing perfection, since the market is making stern judgements on what can be accomplished given today's stressed-to-breaking budgets.

"Before speaking out ask yourself whether your words are true, whether they are respectful and whether they are needed in our civil discussions."-Justice O'Connor



Michael Gissing
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 10:16:28 pm

[Bill Davis] "While I heartily agree with your (sound post is a skill set outside most editors experience) what I find most fascinating is how little regard outsiders typically have for ANY skill set outside their own in this era."

Conversely though, Bill, post people like me do appreciate editors' and directors' skills. As a grader and sound mixer, I love great photography. I can't stand mixing to ungraded images, and I hate the fact that Color had no sound, so grading is done mute.

Editors often give me a thumbs up during a mix because we recognised a cut that just wasn't working and made it work with sound.

But this thread is more about how the tools should allow for the best manipulation of both sound and video at the edit stage and allow the job to progress to post where budgets allow. I get the shrinking budgets and the perceived and real need to finish in a single tool, regardless of how far it pushes skill sets and comfort zones. For that reason I wonder how the magnetic timeline/Roles concept in particular is helpful or detrimental. There is much that can be learned from DAWs because they are, in many ways, more mature and have tried these concepts and methodologies. I would love a dollar for every time a picture editor has watched me on the Fairlight and said, "Why can't video editing software be like that?"



David Lawrence
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:10:25 am

[Franz Bieberkopf] "Is there a short summary - that thread you link to is very long and meandering with lots of cross-topics. I couldn't find what might be the start of this."

Franz -- thanks, yeah that other thread was way too long and way off topic. The sub-thread started with a question I asked Jeremy. I asked if he thought the magnetic timeline would make sense for a DAW. His answer was a lightbulb moment for me. I finally understood why we think of tracks in an NLE so differently.

Here's a link to my reply:

http://forums.creativecow.net/readpost/335/16840

This led to a great conversation that's worth digging through if you can find the thread. It really deserved its own topic, so I moved it up here.

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl



Franz Bieberkopf
Re: NLEs, DAWs, Tracks and Story-centric Workflows
on Sep 30, 2011 at 3:14:09 pm

Thanks David.


Wow. I have lots to say. I'll try to be as brief as I can and focus on the main issue that has been with me since the introduction of the new software. Keep in mind that I haven't used FCPX (and please correct me if my understanding of the software is wrong).

Lots of bells ringing but I'll start by picking up on this:

[Andrew Richards]
"Story and characters are what the audience connects with."

Maybe it is for the type of material you work with. This might even be true for most editing in general (though I doubt it). But the question at hand is - should tools be designed around that model?

The main issue is precisely those words floating in this thread (and I think in previous ones too): "Editorial intent".

In order for a designer to start designing around my "editorial intent", they have to start making a lot of assumptions about what editing is - or more specifically what kind of editing I want to do. In other words, they have to start designing around more formulaic models of what editing is. I think the long term implications of this are clear - taken to its conclusion, this will mean more formulaic editing and less creative approaches. (The irony here is interesting to me).

In short, FCPX seems to assume:
- that I have a "primary through-line" against which I am judging and adding other things
- that this is, mostly, a "storyline" of some sort

This seems to have been developed for, and to work best with, the most formulaic work (three of the best examples being):
- music video type productions, where music dictates the edit
- interview with b-roll type formulations
- dialog with cut-away type formulations

(A more thorough bit would examine the assumptions of an "open timeline" ... anyone else want to jump on that?)

Strictly speaking, I think the only intent that can be assumed is that an editor will wish to put sound and image together in time. All else beyond that starts to get ... a bit messed up.

I think that's all I have time for right now ...


Franz.



Chris Harlan
Re: NLEs, DAWs, Tracks and Story-centric Workflows
on Sep 30, 2011 at 3:29:37 pm

[Franz Bieberkopf] "Strictly speaking, I think the only intent that can be assumed is that an editor will wish to put sound and image together in time. All else beyond that starts to get ... a bit messed up.
"


Yup.


Return to posts index

Andrew Richards
Re: NLEs, DAWs, Tracks and Story-centric Workflows
on Sep 30, 2011 at 6:28:22 pm

[Franz Bieberkopf] "Maybe it is for the type of material you work with. This might even be true for most editing in general (though I doubt it). But the question at hand is - should tools be designed around that model?"

I base that statement on my observations of regular people consistently truly enjoying content despite relatively abysmal reproduction. From highly compressed SD stretched out onto an HDTV to badly encoded, pirated TV shows and movies on YouTube, regular people just don't have high standards when it comes to signal quality. A bad script or a bad performance is much more of a turn-off to them than crunchy artifacting on a dissolve or flat dynamic range on cruddy speakers.

[Franz Bieberkopf] "In short, FCPX seems to assume:
- that I have a "primary through-line" against which I am judging and adding other things
- that this is, mostly, a "storyline" of some sort

This seems to be developed for, and to work best with, the most formulaic kinds of work (to take three of the best examples):
- music video type productions, where music dictates the edit
- interview with b-roll type formulations
- dialog with cut-away type formulations"


Are these formulas? Or conventions? The only difference between the magnetic timeline and the open timeline, as I see it, is how the user interacts with them. Both metaphors wind up producing a composite video stream synced to audio along a regular time scale. I don't see any technical barriers to being as creative in a magnetic timeline as you can be in an open, track-based timeline. Default ripple and non-spatial organization may rub you the wrong way (like, a lot), but they don't preclude creativity. They are a different means to the same end.
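
To put the interaction difference in the plainest terms I can - a rough sketch, not either app's actual code, and the function names are mine:

    # Rough sketch of the two editing defaults -- not real NLE code.

    def ripple_delete(clips, index):
        """Magnetic-style default: removing a clip pulls everything after it earlier -- no gap."""
        del clips[index]  # positions are implicit in the order, so downstream clips "move" for free
        return clips

    def open_timeline_delete(clips, index):
        """Track-style default: removing a clip leaves a hole; nothing downstream moves."""
        clips[index] = None  # the gap stays until the editor decides what to do with it
        return clips

    sequence = ["intro", "interview", "b-roll", "outro"]
    print(ripple_delete(list(sequence), 2))         # ['intro', 'interview', 'outro']
    print(open_timeline_delete(list(sequence), 2))  # ['intro', 'interview', None, 'outro']

Either way the final output is the same composite stream; the only question is which default you spend more time working around.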

Best,
Andy


Return to posts index

Franz Bieberkopf
Re: NLEs, DAWs, Tracks and Story-centric Workflows
on Sep 30, 2011 at 6:54:21 pm

[Andrew Richards] "I base that statement on my observations of regular people consistently truly enjoying content despite relatively abysmal reproduction."

While I am a bit alarmed by your dismissal of "irregular people" and their reactions, more to the point: I think you've missed my point. I wasn't discussing reproduction quality, but your focus on character and story as essential and fundamental to editing (and therefore fundamentally important to the "paradigm").

A pointed and illustrative example would be the work of Paul Sharits - it's amusing to think of him assembling a "storyline" on his editing machine of choice ...

[Andrew Richards] "Are these formulas? Or conventions?"

Though I think one can make a distinction between formulas and conventions, you could use the words interchangeably in what I have stated and my point still stands.

It might be better to summarize by saying that the degree to which my editing tools assume what I want is the degree to which they start to limit my endeavors.

You're right in terms of ends, but I am interested in how FCPX shapes the process.


Franz.


Return to posts index

David Lawrence
Re: NLEs, DAWs, Tracks and Story-centric Workflows
on Sep 30, 2011 at 8:38:47 pm

[Franz Bieberkopf] "You're right in terms of ends, but I am interested in how FCPX shapes the process."

I agree with that statement fully. I've just made some screen grabs and will demonstrate in my next post.

Jeremy, Andrew, Walter - I'll address your questions and raise some new ones in that post. We can do that in this thread or we can start a new topic, I'm fine either way, let's just link if we jump threads to keep continuity.

We're digging into the core issues. This is the key stuff and this conversation is great. A lot is interrelated so I really want to keep this thread going even if we move to another topic heading. Give me an hour or so to prep. Stay tuned...

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl


Return to posts index

David Lawrence
Re: NLEs, DAWs, Tracks and Story-centric Workflows
on Oct 1, 2011 at 12:39:18 am

Folks, apologies for the delay. I got a bit deeper than I expected and it took more time than I thought. It turned into something of a short article, so I went ahead and started a new related topic. I think this will also be a good place to explore some of the examples Walter mentioned.

Let's keep this thread going but keep the emphasis on audio, music and DAW vs NLE UIs.

Please join me to continue discussing tracked vs. trackless timelines, editorial intentions and spatial workflows here:

The Open Timeline and Spatial Workflows -- An Example

p.s. Jeremy -- I promise I'll get to all of your questions ;)

_______________________
David Lawrence
art~media~design~research
propaganda.com
publicmattersgroup.com
facebook.com/dlawrence
twitter.com/dhl


Return to posts index

Chris Harlan
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:13:50 am

[Franz Bieberkopf] "It has been my long sad lament that editing software is primarily viewed as a visual realm (by both designers and users). (Already in this thread it's been shunted in that direction.)
"


It's funny, Franz. It really depends, I think, on whether you come from television or film. Television gets used quite a bit differently, and amongst many television producers, writers, and editors I believe there is a general recognition that audio is often the stronger component.


Return to posts index

Andrew Richards
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 5:27:05 am

[Chris Harlan] "It's funny, Franz. It really depends, I think, on whether you come from television or film. Television gets used quite a bit differently, and amongst many television producers, writers, and editors I believe there is a general recognition that audio is often the stronger component."

A wise man once told me that people don't watch TV, they listen to it. He referred me to the sometimes sloppy visual continuity in sitcoms that rely more on audio timing to get a laugh.

Best,
Andy


Return to posts index

Bill Davis
Re: NLEs, DAWs, Tracks and Audio-centric Workflows -- Continuing the Conversation...
on Sep 30, 2011 at 6:01:15 pm

It seems to me that this thread has been largely a look at the “foundational thinking” behind the entire idea of the NLE interface.

Which reminds me of how Walter Murch set out to include similar foundational thinking with “In the Blink of an Eye” - examining some of what our brains do as they “edit” our experiences.

So what are the essential differences in function between sight and sound in humans?

Obviously humans developed many skills for species survival – but can they be prioritized? Is there a hierarchy to our senses? And if so, does that inform us about how we should weight content in our editing work?

It seems to me that danger has often been HEARD first (particularly since sound, unlike sight, is essentially omni-directional), but the next, instinctive step is to use sight to analyze the threat potential.

When we see a threat first, we listen, but having already identified it, we don’t listen as acutely because we already know much about its nature.

If we hear first, we universally look if we are able.

We value visual recognition over auditory recognition because it gives us superior information about the world.

So it seems to me that, as video artists, we follow this pattern when it makes sense. We value the visual information stream ahead of the auditory one – while understanding that the picture of a bear in the woods will always be incomplete if we can’t tell whether she’s growling.

Interesting stuff.

"Before speaking out ask yourself whether your words are true, whether they are respectful and whether they are needed in our civil discussions."-Justice O'Connor


Return to posts index
