disguise-one / hap-encoder-adobe-cc

Community-supported exporter of HAP codec family for Adobe CC applications

Improve export time - can final copy / delete be removed #20

Closed GregBakker closed 4 years ago

GregBakker commented 5 years ago

From @exscriber:

Another thing to improve encoding time: get rid of the post-render copy step: https://forums.adobe.com/message/10559705#10559705 If it's ever possible...

GregBakker commented 5 years ago

That 'post-render copy step' is likely being done by Adobe's file API, which we're wrapping and using for cross-platform support [and which I assume plays nicely with CC]. If we weren't using it, Adobe would have no way of locating the file - we could call it whatever we like. Adobe isn't handling the container format, so there's no reason for a copy step there [as per the referenced discussion]. It could be that they do it to invalidate files from broken exports - but then why copy the file rather than just renaming it?

exscriber commented 5 years ago

The issue happens after the plugin's MovieWriter closes the file, so Adobe is trying to do something with the complete .mov file during the copy phase. It's not possible to hide the file by changing its extension: *.111 doesn't help.

Random guess: it could be related to Adobe metadata (because no *.xmp sibling file is created). Random starting point: PrSDKExport.h / exDoExportRec->reserveMetaDataSpace

GregBakker commented 5 years ago

Have done a fair bit of work trying to get around this.

The copy step is introduced because Adobe CC is scanning the created .mov and adding its XMP metadata. Disabling metadata (in the export settings, below the settings from the preset but not inside them) avoids the issue.

Other people are having this problem as well: https://forums.adobe.com/message/10559705#10559705

I tried adding an XMP_ atom to the file, spaced out to include reserveMetaDataSpace as pointed out by @exscriber, but this was not then picked up by the enclosing framework. It still did the copy.

Avoiding Adobe file APIs also still triggered the copy.

Adobe's own creation of .mov / .mp4 files (say with h264) does not have this problem, so internally there's a way they avoid it.

We probably do want to include the metadata in the exported file so as to play well with the rest of the ecosystem.

In the next release we'll add a note that this performance issue may be avoided by disabling metadata.

We've reached out to Adobe for assistance in resolving this issue.

bangnoise commented 5 years ago

@GregBakker what was the final atom layout (from the exporter, before the enforced re-write) when you used reserveMetaDataSpace? Where and how were you reserving the space? I see their exporters reserve a large amount of space (more than reserveMetaDataSpace asks for) between moov and mdat - eg a very short test to QuickTime ProRes produced

ftyp
moov 13090 bytes including XMP
free 1035438 bytes
wide
mdat

An ample free after the moov doesn't cause them to rewrite the moov in place?
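The atom ordering in question is easy to inspect programmatically. Here's a short sketch (not part of the exporter - just an illustration of the top-level QuickTime atom structure: a 4-byte big-endian size followed by a 4-character type) that parses a synthetic byte stream laid out like the ProRes example above:

```python
import struct

def parse_top_level_atoms(data: bytes):
    """Parse top-level QuickTime/MP4 atoms: 4-byte big-endian size + 4-byte type.

    A size of 1 means a 64-bit extended size follows; a size of 0 means
    "extends to end of file". Returns a list of (type, size) tuples.
    """
    atoms, pos = [], 0
    while pos + 8 <= len(data):
        size, kind = struct.unpack_from(">I4s", data, pos)
        if size == 1:    # 64-bit extended size follows the type field
            size = struct.unpack_from(">Q", data, pos + 8)[0]
        elif size == 0:  # atom extends to end of file
            size = len(data) - pos
        atoms.append((kind.decode("ascii"), size))
        pos += size
    return atoms

def atom(kind: bytes, payload: bytes = b"") -> bytes:
    """Build a minimal atom: size field + type + payload."""
    return struct.pack(">I", 8 + len(payload)) + kind + payload

# Synthetic stream mimicking the layout observed above (payload sizes made up)
sample = (atom(b"ftyp", b"qt  ") + atom(b"moov", b"\x00" * 100) +
          atom(b"free", b"\x00" * 64) + atom(b"wide") +
          atom(b"mdat", b"\x00" * 16))
print([k for k, _ in parse_top_level_atoms(sample)])
# → ['ftyp', 'moov', 'free', 'wide', 'mdat']
```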

GregBakker commented 5 years ago

TLDR: that's correct, working out how much space to give it is the trick.

I can confirm through testing that's absolutely the case. I've made more progress on this (also with input from Adobe): CC will notice if extra space has been allocated in the header, and will skip the post copy if it's there.

I can also get ffmpeg to add this space - including an amount for metadata as well - but it needs to be told in advance how much space the moov atom it generates will require, according to a recipe that it doesn't publish via any API.

ffmpeg / libavformat can also be told to put the moov atom at the start, either by giving it the space up front, or by shuffling the atom to the start afterwards. In the first case it hard-fails if it doesn't have enough space; in the second it's similar to what's happening at present, just done inside ffmpeg [although the shuffle is done in place].
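For reference, the two approaches correspond to the mov muxer's `moov_size` option and the `faststart` movflag on the ffmpeg command line. A minimal sketch building those invocations (option spellings are from ffmpeg's mov muxer documentation; verify against your ffmpeg build):

```python
def reserve_moov_cmd(src: str, dst: str, moov_bytes: int) -> list:
    """Reserve moov_bytes at the front of the file; ffmpeg hard-fails
    if the generated moov atom outgrows the reservation."""
    return ["ffmpeg", "-i", src, "-c", "copy",
            "-moov_size", str(moov_bytes), dst]

def faststart_cmd(src: str, dst: str) -> list:
    """Write the moov at the end, then shuffle it to the front
    in a second pass once encoding has finished."""
    return ["ffmpeg", "-i", src, "-c", "copy",
            "-movflags", "+faststart", dst]

print(reserve_moov_cmd("in.mov", "out.mov", 5 * 1024 * 1024))
```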

My current approach, not quite complete, is to calculate in advance how big the moov atom will be - tedious, but not too hard. If we ever update ffmpeg / libavformat we'll need to watch carefully for changes to movenc.c, but I don't see a better way at present.

libavformat doesn't seem designed to be given a full 'plan' of frames in advance, so we either need to use something that can, or do the extra legwork.

Nearly there with this :)

bangnoise commented 5 years ago

The moov_size option to libavformat's movenc is a maximum, so I don't see why you have to calculate the precise size - am I missing something?

GregBakker commented 5 years ago

That's true - and I'm not trying to calculate a precise size, just something big enough.

A short movie might have a moov atom of a few KB. A 10-minute 24 fps movie with sound has a 176 KB moov atom. The moov atom of a 30-minute 60 fps 5.1-channel movie (if we ever support that) would presumably be larger. We could decide that the largest movie we support will always have (say) a < 5 MB moov chunk, and add 5 MB to every 5-second movie we create - but it's not too hard to trawl through movenc and do a simple upper-bound calculation based on the number of frames and the presence of audio, erring on the side of larger. Nearly done, in fact :)
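Such an upper-bound calculation might look like the sketch below. The coefficients here are illustrative assumptions, not values taken from movenc.c - the point is only that the moov grows with the per-sample tables (stsz, stts, stco/co64, etc.), so a linear model in sample count plus a generous margin is enough to over-reserve:

```python
def estimate_moov_size(video_frames: int, audio_chunks: int = 0,
                       fixed_overhead: int = 4096,
                       per_sample: int = 24, margin: float = 1.5) -> int:
    """Rough upper bound on the moov atom size, in bytes.

    fixed_overhead covers the per-track headers; per_sample covers the
    sample-table entries. Both values are illustrative guesses, and
    margin deliberately errs on the side of reserving too much.
    """
    samples = video_frames + audio_chunks
    return int((fixed_overhead + per_sample * samples) * margin)

# A 10-minute 24 fps movie -> 14400 video frames
print(estimate_moov_size(14400))
# → 524544
```

Over-reserving just leaves a slightly larger free atom behind the moov; under-reserving makes libavformat fail the export, so the margin is cheap insurance.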

Some details: the moov atom varies in size according to how long the media tracks are. It also varies a bit depending on where it's placed in the file.

libavformat (in movenc.c) errors out at line 6608 if the reserved size isn't big enough. If you reserve space, it actually writes the moov block after it has written the mdat, by seeking back in the file. The way it knows how much it has written is by recording the file position when it starts the atom, writing it, then checking the new file position to see whether it overwrote the mdat. It then rewinds and updates the moov atom size in place.
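That reserve-then-seek-back mechanism can be mimicked in a short sketch (a simplified model of the strategy, not movenc.c itself): leave a gap after the ftyp, write the mdat, then rewind and fill the gap with the moov plus a free atom for whatever space is left over.

```python
import io
import struct

def atom(kind: bytes, payload: bytes = b"") -> bytes:
    """Build a minimal atom: 4-byte big-endian size + type + payload."""
    return struct.pack(">I", 8 + len(payload)) + kind + payload

def write_with_reserved_moov(mdat_payload: bytes, moov_payload: bytes,
                             reserved: int) -> bytes:
    """Write ftyp, a reserved gap, then mdat; seek back and fill the gap
    with moov + free. Fails (like movenc) if the moov doesn't fit."""
    buf = io.BytesIO()
    buf.write(atom(b"ftyp", b"qt  "))
    gap_start = buf.tell()
    buf.write(b"\x00" * reserved)            # placeholder for moov + free
    buf.write(atom(b"mdat", mdat_payload))   # media data
    moov = atom(b"moov", moov_payload)
    if len(moov) + 8 > reserved:             # 8 bytes needed for the free header
        raise ValueError("reserved moov space too small")
    buf.seek(gap_start)                      # rewind into the gap
    buf.write(moov)
    buf.write(atom(b"free", b"\x00" * (reserved - len(moov) - 8)))
    return buf.getvalue()
```

The free atom exactly pads out the reservation, so the mdat never moves and no final copy is needed.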

bangnoise commented 5 years ago

Does the Adobe XMP writer reject a file with layout

ftyp
mdat
moov
free

?

Presumably that is what libavformat writes by default (minus the free), and if the XMP stage will accept it, it would be much simpler for us to generate.

GregBakker commented 5 years ago

That is indeed what we've been writing up to now. If we don't reserve space libavformat puts the moov atom at the end.

Adobe CC doesn't reject it, but it does make a copy of it in place with the moov reordered to the top. This is the part that takes >8 minutes for a 21 GB file on a slow disk. It always does this unless it finds a moov of sufficient size with an internal XMP_ atom also of sufficient size.

Disabling metadata also avoids this - but I've found no way of doing that from the preset or via the exporter API. 'Metadata' isn't really the problem, though: putting the moov at the top is incidental to metadata. The feature is 'fast start', which helps playback begin while streaming. It's irrelevant for our use case, but unfortunately it's forced upon us, causing the whole problem in the first place.

bangnoise commented 5 years ago

Yep, I was just making sure you had tried that ordering plus the extra free atom as if the XMP processor did accept it, it would save us (you!) a lot of work. It sounds like you have tried and it doesn't (for any useful sense of "accept") so I'll shut up now.

GregBakker commented 4 years ago

This is fixed in the current foundation [and is likely fixed in the Hap encoder 1.0.9 prerelease]; it is definitely fixed in the upcoming 1.1.0 release.