StandardCyborg / StandardCyborgCocoa

Everything you need for 3D scanning on iOS
https://www.standardcyborg.com

Poor tracking and scan quality on iPhone 14 #32

Closed · XkRecarens closed this issue 11 months ago

XkRecarens commented 1 year ago

Results when scanning with an iPhone 14 and with older iPhones are very different under the same conditions and environment. According to the specifications, the iPhone 14 has an aperture of f/1.9, which means more light enters and there is less depth of field. Does anyone know how to correct this problem? I've tried different AVCaptureDevice video formats, but I still haven't found an answer.
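For reference, comparing the per-device depth formats discussed below can be done with plain AVFoundation; a minimal sketch, nothing StandardCyborg-specific:

```swift
import AVFoundation

// Enumerate every depth format the TrueDepth camera offers, with its
// resolution and field of view, to compare devices under identical settings.
if let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                        for: .video,
                                        position: .front) {
    for format in device.formats {
        for depthFormat in format.supportedDepthDataFormats {
            let dims = CMVideoFormatDescriptionGetDimensions(depthFormat.formatDescription)
            print("depth \(dims.width)x\(dims.height), fov \(depthFormat.videoFieldOfView)")
        }
    }
}
```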

scratchthatitch commented 1 year ago

Are you referring to the spec change of the TrueDepth sensor, or of the camera? I believe they are independent systems.

XkRecarens commented 1 year ago

I didn't find any changes to the sensor; I'm referring to the TrueDepth camera and the results I'm obtaining when scanning with the iPhone 14. Thanks

XkRecarens commented 1 year ago

The depth formats' FOV differs between devices, which may cause the problem. But the scan loses tracking super easily, and the scans are very bad.

iPhone 12: Format('dpth'/'fdep' 320x180, { 2-30 fps}, photo dims:{}, fov:73.292)
iPhone 14: Format('dpth'/'fdep' 320x180, { 2-30 fps}, photo dims:{}, fov:73.699)

saamerm commented 1 year ago

@XkRecarens could you please share some examples? Are you using the iPhone 14 Pro Max?

aaptho commented 1 year ago

The RGB camera's FOV and aperture aren't directly related to the TrueDepth FOV, so I don't think that particular difference is the issue.

That being said, it is possible that newer iPhones no longer need this scale correction factor, which we landed when testing with the iPhone X long ago: https://github.com/StandardCyborg/StandardCyborgSDK/blob/4fa4b04ee29720ef9f7b74217201759418f05583/StandardCyborgFusion/Helpers/PerspectiveCamera%2BAVFoundation.mm#L41

I don't have an iPhone 14 to test with, but can you try changing that value from 1.019 to 1.00? The proof would be to scan a physical cube with exactly known dimensions and compare. If the SDK consistently yields 3D scans that are larger or smaller than the known cube, you should probably tweak the scale correction factor there.
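A back-of-the-envelope sketch of that calibration step, assuming (and this is only an assumption, not a documented SDK contract) that the reconstructed size scales inversely with the focal length scale factor:

```swift
/// Returns a candidate corrected scale factor from a scan of a cube with
/// known dimensions. `measuredSize` is the cube edge as measured in the
/// scan; `knownSize` is its true edge length, in the same units.
/// Assumes reconstructed size is inversely proportional to the focal
/// length scale factor; this is a guess, not a documented SDK contract.
func correctedScaleFactor(currentFactor: Float,
                          measuredSize: Float,
                          knownSize: Float) -> Float {
    // A scan that comes out too large (measured > known) needs a larger
    // focal length to shrink it proportionally, and vice versa.
    return currentFactor * (measuredSize / knownSize)
}
```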

XkRecarens commented 1 year ago

OK, I will test it and share some info about the results. Thanks

XkRecarens commented 1 year ago

We have done tests with different values: the higher camera.setFocalLengthScaleFactor is, the less it loses tracking, but an object with a diameter of 16 cm now scans at roughly 13.5 cm. In any case, the loss of tracking still occurs often. I'm sharing some images that clearly show the problem (see below). All the scans were made with the same methodology and the same steps. iPhone 12 Pro Max

[Screenshot 2023-04-11 at 16:32:35]

iPhone 14 Pro Max with camera.setFocalLengthScaleFactor(1.019f)

[Screenshot 2023-04-11 at 16:37:05]

iPhone 14 Pro Max with camera.setFocalLengthScaleFactor(1.15f)

[Screenshot 2023-04-11 at 16:44:30]
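For what it's worth, if the inverse-proportionality guess sketched above holds, this data point (a factor of 1.15 yielding 13.5 cm for a 16 cm object) would suggest trying a factor of roughly 1.15 × 13.5 / 16 ≈ 0.97, i.e. below 1.0 rather than above it.
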
XkRecarens commented 1 year ago

I'm sharing a video from an iPad Pro 11": the scan starts to shake at the same point. This happens even more strongly on the iPhone 14, and the point cloud is repositioned wrongly in the scene, which creates double meshes and bad results. Is there any way to avoid the shaking of the scan? Thank you all

https://user-images.githubusercontent.com/85178688/231438912-867625cd-3cfe-4fb7-a12b-6ebffeffc695.mov

XkRecarens commented 1 year ago

I also notice this in the reconstructSingleDepthBuffer representation before scanning: huge blue lines are rendered on the left side on the iPhone 14.

iPhone 12 Pro Max [IMG_D85DE1814660-1]

iPhone 14 Pro Max with camera.setFocalLengthScaleFactor(1.049f) [IMG_45E0301D5939-1]

scratchthatitch commented 1 year ago

Have you played around with the depth threshold setting for what is captured? This is what happens when the depth is too short. I can't remember the property name, though; I will dig it out later.
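The property name isn't identified in this thread; as a purely illustrative stand-in, clamping a Float32 depth buffer to a maximum range looks roughly like this (`clampDepth` and `maxDepthMeters` are made-up names, not SDK API):

```swift
import CoreVideo

// Hypothetical pre-filter: invalidate depth samples beyond a chosen range
// before reconstruction. Assumes kCVPixelFormatType_DepthFloat32 data.
func clampDepth(_ pixelBuffer: CVPixelBuffer, maxDepthMeters: Float) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)

    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width where row[x] > maxDepthMeters {
            // TrueDepth frames mark invalid pixels as NaN; treat
            // out-of-range samples the same way.
            row[x] = .nan
        }
    }
}
```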

aaptho commented 1 year ago

The point cloud preview rendering comes from the matcap image in Assets.xcassets, which is used as a matcap texture: we map the computed normals to texture coordinates within that texture as a simple means of visualization.

I don't know what depth values are being mapped to the gray area, but that does seem suspicious. Perhaps the iPhone 14 is giving us depth values that we don't know how to handle. To really dig in, you'd have to look at the raw data frames coming from the sensor, and then see how they're handled in MetalDepthProcessor.
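One lightweight way to start that digging is to dump per-frame statistics from the raw AVDepthData before it reaches MetalDepthProcessor; a sketch:

```swift
import AVFoundation

// Sanity-check a raw TrueDepth frame: convert to Float32 depth and report the
// value range plus how many pixels the sensor marked invalid (NaN).
func logDepthStats(_ depthData: AVDepthData) {
    let depth = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let buffer = depth.depthDataMap

    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return }
    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(buffer)

    var minDepth = Float.greatestFiniteMagnitude
    var maxDepth = -Float.greatestFiniteMagnitude
    var nanCount = 0
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let d = row[x]
            if d.isNaN { nanCount += 1 }
            else { minDepth = min(minDepth, d); maxDepth = max(maxDepth, d) }
        }
    }
    print("depth \(width)x\(height): \(minDepth)-\(maxDepth) m, NaN pixels: \(nanCount)")
}
```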

Jaykob commented 1 year ago

Hi there! I'm also busy finding out how we can improve scans on iPhone 13 and newer. Because it wasn't mentioned before: Apple switched the manufacturer from AMS (iPhone X to iPhone 12) to LG and others (more info here: https://www.yolegroup.com/technology-insights/the-second-generation-face-id-is-again-a-marvel-of-technology/).

Unfortunately, scan quality seems to have degraded since the switch. The biggest problem I'm seeing is artefacts: points on the border of an object that seem to be hallucinated, with far too high depth values. Take a look at a single frame and it should be obvious. As this stems from the raw depth maps, I guess we'd need to do some form of pre-processing / filtering, but that's where I'm currently stuck. Maybe something like statistical outlier rejection could help (a sketch follows below).
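A naive CPU sketch of that outlier-rejection idea, operating on a flat depth array (the function name and the 3x3-median approach are illustrative; a real version would likely run on the GPU next to MetalDepthProcessor):

```swift
// Reject depth samples that deviate from their 3x3 neighborhood median by
// more than `tolerance` meters; such isolated jumps match the hallucinated
// border points described above. Invalidated samples become NaN.
func rejectDepthOutliers(_ depth: [Float], width: Int, height: Int,
                         tolerance: Float = 0.05) -> [Float] {
    var filtered = depth
    for y in 1..<(height - 1) {
        for x in 1..<(width - 1) {
            let center = depth[y * width + x]
            if center.isNaN { continue }

            var neighbors: [Float] = []
            for dy in -1...1 {
                for dx in -1...1 where !(dx == 0 && dy == 0) {
                    let d = depth[(y + dy) * width + (x + dx)]
                    if !d.isNaN { neighbors.append(d) }
                }
            }
            if neighbors.isEmpty { continue }

            let median = neighbors.sorted()[neighbors.count / 2]
            if abs(center - median) > tolerance {
                filtered[y * width + x] = .nan
            }
        }
    }
    return filtered
}
```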

aaptho commented 1 year ago

> Hi there! I'm also busy finding out how we can improve scans on iPhone 13 and newer. Because it wasn't mentioned before: Apple switched the manufacturer from AMS (iPhone X to iPhone 12) to LG and others (more info here: https://www.yolegroup.com/technology-insights/the-second-generation-face-id-is-again-a-marvel-of-technology/).

Ah, nice find on that supplier change! I also suspect the closer distance between emitter and sensor can yield higher error and shadowing. Pixels that the sensor + software determine to be invalid come through as NaN in the raw depth frames, which we do ignore during processing.

I just published some changes to VisualTesterMac, as well as instructions in the README for how to grab a raw dump of this data.

It may be worth trying the full-resolution depth frames in case they happen to have less error.
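If anyone wants to try that, selecting the largest supported depth format is plain AVFoundation; an untested sketch:

```swift
import AVFoundation

// Switch the TrueDepth device from the default depth format to the largest
// Float32 one its active video format supports.
func selectLargestDepthFormat(on device: AVCaptureDevice) throws {
    let depthFormats = device.activeFormat.supportedDepthDataFormats.filter {
        CMFormatDescriptionGetMediaSubType($0.formatDescription)
            == kCVPixelFormatType_DepthFloat32
    }
    guard let largest = depthFormats.max(by: {
        CMVideoFormatDescriptionGetDimensions($0.formatDescription).width <
        CMVideoFormatDescriptionGetDimensions($1.formatDescription).width
    }) else { return }

    try device.lockForConfiguration()
    device.activeDepthDataFormat = largest
    device.unlockForConfiguration()
}
```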

aaptho commented 1 year ago

> I'm sharing a video from an iPad Pro 11": the scan starts to shake at the same point. This happens even more strongly on the iPhone 14, and the point cloud is repositioned wrongly in the scene, which creates double meshes and bad results. Is there any way to avoid the shaking of the scan? Thank you all
>
> RPReplay_Final1681296970.mov

In this case, it’s likely shaking because there’s very little geometric uniqueness for it to track on. The shape of a hand + arm on its edge is very round, and the table is also flat, so it’s harder to rotationally align along that axis. This is a fundamental limitation of our tracking based on depth only, without taking color into account.

aaptho commented 1 year ago

OK, so it's possible this tracking instability on iPhone 14 was actually a thread priority inversion issue between the tracking thread and other system threads! If so, it may have been exacerbated by the iPhone 13 and 14 getting faster and therefore having different timing.

This thread priority issue is now fixed with commit 657ed75.

@XkRecarens and others, can you verify whether tracking is more stable with this commit? It's not published to the main StandardCyborgFusion framework yet, but you can test by checking out the latest main in StandardCyborgSDK, then building and running the TrueDepthFusion app.

I don’t expect it will improve scan accuracy, but it shouldn’t just fall apart anymore if my theory is correct.
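For readers unfamiliar with the failure mode: a priority inversion happens when a high-priority thread blocks on work held by a lower-priority thread that the system cannot boost. A generic illustration of the usual shape of such a fix (not necessarily what commit 657ed75 actually changes; the queue label is made up):

```swift
import Dispatch

// DispatchSemaphore carries no ownership information, so the kernel cannot
// boost a low-priority thread that holds it while a high-priority thread
// waits: a classic inversion. A serial queue with an explicit, pinned QoS
// avoids the mismatch when guarding shared tracking state.
let trackingQueue = DispatchQueue(label: "com.example.tracking",
                                  qos: .userInteractive)

func updateTrackingState(_ work: @escaping () -> Void) {
    // Asynchronous hand-off never blocks the capture thread, and the queue's
    // QoS is fixed rather than inherited from whoever submits work first.
    trackingQueue.async(execute: work)
}
```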

aaptho commented 11 months ago

StandardCyborgFusion version 2.3.3 is now released, which includes the fix!

Please comment or re-open if anyone finds further issues with tracking being very unreliable.

scratchthatitch commented 11 months ago

Hey,

Awesome news! Fingers crossed.

Will this be updated automatically via CocoaPods if recompiled and installed?

aaptho commented 11 months ago

> Hey,
>
> Awesome news! Fingers crossed.
>
> Will this be updated automatically via CocoaPods if recompiled and installed?

Yes, just run `pod update`, which should print that it updated StandardCyborgFusion to 2.3.3; then rebuild and run your project, and you should be all set!