owncloud / ios-legacy

:iphone: iOS app for ownCloud
https://itunes.apple.com/app/owncloud/id543672169
GNU General Public License v3.0

How To support Apple's Live Photo? #578

Open satifanie opened 8 years ago

satifanie commented 8 years ago

I tried, but the Live Photo turned into a normal photo after I uploaded it and downloaded it again. Can Live Photos be supported?

javiergonzper commented 8 years ago

Hello @satifanie, that is a good feature. We have put it on the backlog to work on it in the future. Thank you for your feedback!

javiergonzper commented 8 years ago

Reading the documentation: https://developer.apple.com/library/prerelease/ios/releasenotes/General/WhatsNewIniOS/Articles/iOS9_1.html#//apple_ref/doc/uid/TP40016572-DontLinkElementID_2

It looks like if we export the photos in the Live Photo format, they are not JPG or PNG, so the only devices that could see the pictures would be other iPhone 6s or 6s Plus models... I think supporting Live Photos is a bad idea :disappointed:

rwilliamsit commented 8 years ago

I think it's possible. A Live Photo is composed of two simple files: a JPG and a MOV. There are iOS apps out there that already seem to do this (e.g. PhotoSync, which actually claims to support uploading to ownCloud).

Please note: I am no expert here and I haven't used PhotoSync - I have just been doing some research on this topic because I am hoping to cancel my Apple iCloud subscription after sorting out these types of challenges.

Anyway, this is what I found about Live Photos: the two files are tied together by an asset identifier, a UUID stored as a string in the metadata.

1) The JPEG must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple Maker Note asset identifier key).

2) The QuickTime MOV is encoded with H.264 at the appropriate frame rate (12-15 fps) and size (1080p). This MOV must have a top-level QuickTime metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If you are using AVAsset, you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata).

It must also have a timed metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]. The actual still-image time matches the presentation timestamp of this metadata item; the payload seems to be just a single 0xFF byte (aka -1) and can be ignored. If you are using an AVAssetReader, you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.
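
For illustration, here is a minimal Swift sketch of reading that presentation timestamp with AVAssetReader. It assumes the still-image-time track is the first (or only) metadata track in the MOV; a robust version would inspect each track's format descriptions before picking one.

```swift
import AVFoundation
import CoreMedia

// Sketch: find the presentation time of the still image inside a Live Photo MOV
// by reading its timed metadata track ("com.apple.quicktime.still-image-time").
// Assumption: the first metadata track is the still-image-time track.
func stillImageTime(in movieURL: URL) -> CMTime? {
    let asset = AVURLAsset(url: movieURL)
    guard let track = asset.tracks(withMediaType: .metadata).first,
          let reader = try? AVAssetReader(asset: asset) else { return nil }

    // nil outputSettings vends the samples in their original format.
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    reader.startReading()

    while let buffer = output.copyNextSampleBuffer() {
        // The payload is just a single 0xFF byte; only the timestamp matters.
        if CMSampleBufferGetNumSamples(buffer) > 0 {
            return CMSampleBufferGetOutputPresentationTimeStamp(buffer)
        }
    }
    return nil
}
```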

The assetIdentifier is what ties the two items together and the timed metadata track is what tells the system where the still image sits in the movie timeline.
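
For illustration, a minimal Swift sketch (assuming the pair has already been exported to local JPEG/MOV files) that reads the shared identifier from both halves, which could be used to verify a pair before upload:

```swift
import AVFoundation
import ImageIO

// Sketch: read the shared asset identifier from both halves of a Live Photo pair.

// MOV half: top-level QuickTime metadata "com.apple.quicktime.content.identifier".
func assetIdentifier(fromMovieAt url: URL) -> String? {
    let asset = AVURLAsset(url: url)
    let items = AVMetadataItem.metadataItems(
        from: asset.metadata(forFormat: .quickTimeMetadata),
        filteredByIdentifier: .quickTimeMetadataContentIdentifier)
    return items.first?.stringValue
}

// JPEG half: Apple Maker Note key 17 inside kCGImagePropertyMakerAppleDictionary.
func assetIdentifier(fromJPEGAt url: URL) -> String? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any],
          let makerApple = props[kCGImagePropertyMakerAppleDictionary as String] as? [String: Any]
    else { return nil }
    // Assumption: the maker note key surfaces here as the string "17".
    return makerApple["17"] as? String
}
```

For a valid pair, both functions should return the same UUID string.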

lobeck commented 7 years ago

By now this has become feasible again, as Apple has released an SDK to embed Live Photos into websites - https://developer.apple.com/live-photos/

The most important thing is that both files are uploaded: the image and the adjacent movie file. If they are not modified, uploading them as-is is sufficient. If there is a conversion, the metadata needs to be adjusted.
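
As an illustration of the "upload both files as-is" approach, here is a rough Swift sketch using the Photos framework; `upload` is a placeholder for whatever upload path the app already has, and error handling is trimmed.

```swift
import Photos

// Sketch: export the photo and its paired video from a Live Photo asset
// unmodified, then hand both files to the existing upload path.
func exportLivePhotoResources(of asset: PHAsset,
                              to directory: URL,
                              upload: @escaping (URL) -> Void) {
    let resources = PHAssetResource.assetResources(for: asset)
    for resource in resources where resource.type == .photo || resource.type == .pairedVideo {
        let target = directory.appendingPathComponent(resource.originalFilename)
        let options = PHAssetResourceRequestOptions()
        options.isNetworkAccessAllowed = true  // fetch from iCloud if not local

        PHAssetResourceManager.default().writeData(for: resource,
                                                   toFile: target,
                                                   options: options) { error in
            if error == nil { upload(target) }  // file contents stay untouched
        }
    }
}
```

Because the resources are written out unmodified, the asset identifier metadata in both files is preserved, which is exactly what keeps the pair recognizable as a Live Photo after a round trip.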