arikrupnik opened 4 years ago
I've gained some insight into FCP X's fcpxml format by reading the official Apple documentation for the format, and through trial and error. https://developer.apple.com/library/archive/documentation/FinalCutProX/Reference/FinalCutProXXMLFormat/Introduction/Introduction.html
Similarly, the FCP xmeml format is openly documented, although I believe Adobe introduced some variations in its use.
Despite all this, I believe you might avoid some frustration by first taking a look at OpenTimelineIO, which is tackling this interoperability problem with more resources, and growing industry support.
Thank you for adding the links, @RebelPhoton . fcpxml documentation is massive, and I have no access to an FCPX installation for testing. Having a known-good sample to work from is a great aid to the official documentation.
OpenTimelineIO looks very promising. Do you have any direct experience with it?
I haven't used it yet, and I regret it, because all the work I did last year on FCPX pipelines is not of much use now that I'm working with Premiere. I'll dive into it sooner or later.
Would be curious to hear about the work you did with FCPX pipelines.
How hard would this be to implement in LTCSync? I have been trying to find a way to either put the WAV files into a MOV container with the videos, or have an XML that syncs the tracks on import. I mostly use Resolve, but I want a system where anyone could use linear timecode for syncing.
@Everillangel , what operating system are you on? How comfortable are you with command line?
If you can run ffmpeg on the command line, try this command and tell me if it gives you what you need (adjust the video frame rate etc., as necessary):
ffmpeg -f lavfi -i color=c=black:s=640x480:r=24 -i file.wav -c:a copy -shortest file.mov
Hi, I am comfortable with ffmpeg, but this is not quite what I meant. I am looking either to sync in a container like MOV, which supports multitrack WAV, or to produce an XML that will sync everything. Your current method is kind of hard to use; surely there is a more efficient way. I know this is a free piece of software to go alongside a piece of hardware, but once you decode the timecode track you could easily sync the audio and put it in a container for less hassle.
MOV supports multitrack audio, but not multiple video streams. This is an easy implementation, but hard to communicate to users who have multi-camera shoots.
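For the multitrack-audio case, something along these lines should work (a sketch, not a tested LTCSync feature; the filenames are placeholders, and the two WAVs are generated here as test tones standing in for your synced recordings):

```shell
# Generate two short test tones as stand-ins for camera audio.
ffmpeg -y -f lavfi -i sine=frequency=440:duration=2 cam_a.wav
ffmpeg -y -f lavfi -i sine=frequency=880:duration=2 cam_b.wav

# Mux both WAVs plus a black video track into a single MOV:
# -map 0:v takes the generated black video, -map 1:a and -map 2:a take
# one audio stream from each WAV; the PCM audio is copied untouched.
ffmpeg -y -f lavfi -i color=c=black:s=640x480:r=24 \
       -i cam_a.wav -i cam_b.wav \
       -map 0:v -map 1:a -map 2:a -c:a copy -shortest multitrack.mov
```

Resolve, Final Cut, and Premiere can all see the two audio streams in the resulting file; what they cannot do with a single MOV is carry multiple video streams, which is why this helps you less on a multi-camera shoot.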
It could simplify post workflows for Final Cut and Premiere users if we generated XML timelines they could import directly into the NLE, instead of using unfamiliar padding files.
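As a rough illustration of what such an export might contain (element names follow the xmeml format Premiere and FCP 7 read, but the path, names, and durations are made up, and this has not been verified against an actual import):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE xmeml>
<!-- Minimal xmeml sketch: one 24 fps sequence with a single video clip.
     Path, names, and durations are placeholders, not a tested import. -->
<xmeml version="4">
  <sequence>
    <name>synced_timeline</name>
    <duration>240</duration>
    <rate>
      <timebase>24</timebase>
      <ntsc>FALSE</ntsc>
    </rate>
    <media>
      <video>
        <track>
          <clipitem id="clipitem-1">
            <name>cam_a</name>
            <file id="file-1">
              <name>cam_a.mov</name>
              <pathurl>file:///path/to/cam_a.mov</pathurl>
            </file>
            <start>0</start>
            <end>240</end>
            <in>0</in>
            <out>240</out>
          </clipitem>
        </track>
      </video>
    </media>
  </sequence>
</xmeml>
```

LTCSync would set each clipitem's start offset from the decoded timecode, which is exactly the information it already computes.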
DaVinci Resolve offers valuable insight into timeline exports, and ready examples of such files. I include here two such files: an FCP XML (ironically, for Premiere) and an FCPXML for Final Cut Pro X. Both contain the exact same multicam timeline, with two .mov files in it.
file.fcpxml:
file.xml