Closed FoldedLaundry closed 5 years ago
@FoldedLaundry can you attach a more complete stack trace to this crash? You can get one from Xcode using the `bt all` command in the debugger.
Filed internally as b/135037674
@morganchen12 I will get it for you on Friday.
@morganchen12 I ran the `bt all` command. It listed stack traces of many threads, so I copied only the one that crashed. Let me know if this is good enough.
I had to sanitize the output: XXXXXXX is the name of our internal framework (a Cocoa Touch framework embedded in our project). Everything Firebase-related (Messaging, Analytics, MLKit) is linked only against this framework; wrappers are exposed outside the module.
* thread #18, queue = 'com.google.firebaseml.textrecognition', stop reason = EXC_BREAKPOINT (code=1, subcode=0x1abe8eda4)
* frame #0: 0x00000001abe8eda4 libsystem_malloc.dylib`nanov2_allocate_from_block$VARIANT$armv81 + 528
frame #1: 0x00000001abe8e03c libsystem_malloc.dylib`nanov2_allocate$VARIANT$armv81 + 140
frame #2: 0x00000001abe8df60 libsystem_malloc.dylib`nanov2_malloc$VARIANT$armv81 + 60
frame #3: 0x00000001abe9b99c libsystem_malloc.dylib`malloc_zone_malloc + 156
frame #4: 0x00000001abe9c3ac libsystem_malloc.dylib`malloc + 32
frame #5: 0x00000001ab47462c libc++abi.dylib`operator new(unsigned long) + 44
frame #6: 0x000000010a13d878 XXXXXXX`___lldb_unnamed_symbol1466$$XXXXXXX + 84
frame #7: 0x000000010a4bee28 XXXXXXX`___lldb_unnamed_symbol14587$$XXXXXXX + 592
frame #8: 0x000000010a68e954 XXXXXXX`___lldb_unnamed_symbol21148$$XXXXXXX + 408
frame #9: 0x000000010a689f5c XXXXXXX`___lldb_unnamed_symbol21088$$XXXXXXX + 10104
frame #10: 0x000000010a5233e0 XXXXXXX`___lldb_unnamed_symbol15567$$XXXXXXX + 648
frame #11: 0x000000010a523c40 XXXXXXX`___lldb_unnamed_symbol15572$$XXXXXXX + 576
frame #12: 0x000000010a113d00 XXXXXXX`___lldb_unnamed_symbol799$$XXXXXXX + 316
frame #13: 0x000000010a114ad8 XXXXXXX`___lldb_unnamed_symbol801$$XXXXXXX + 476
frame #14: 0x000000010a0c8c48 XXXXXXX`-[FIRVisionTextRecognizer onDeviceTextRecognitionInImage:beginTime:completion:] + 288
frame #15: 0x000000010a0c7e60 XXXXXXX`__51-[FIRVisionTextRecognizer processImage:completion:]_block_invoke + 624
frame #16: 0x0000000111937840 libdispatch.dylib`_dispatch_call_block_and_release + 24
frame #17: 0x0000000111938de4 libdispatch.dylib`_dispatch_client_callout + 16
frame #18: 0x0000000111940e88 libdispatch.dylib`_dispatch_lane_serial_drain + 720
frame #19: 0x0000000111941b7c libdispatch.dylib`_dispatch_lane_invoke + 460
frame #20: 0x000000011194bc18 libdispatch.dylib`_dispatch_workloop_worker_thread + 1220
frame #21: 0x00000001abeda0f0 libsystem_pthread.dylib`_pthread_wqthread + 312
frame #22: 0x00000001abedcd00 libsystem_pthread.dylib`start_wqthread + 4
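For context, the linking setup described above (Firebase linked only inside the embedded framework, with wrappers exposed to the app) might look roughly like the sketch below. The wrapper's name and shape are assumptions; only the recognizer calls come from the code later in this issue.

```swift
// Inside the embedded framework (XXXXXXX in the trace above).
// Firebase is linked here and nowhere else; the app sees only this wrapper.
import UIKit
import FirebaseMLVision

// Hypothetical wrapper type; the real wrapper names are not in the issue.
public final class TextScanner {
    private let recognizer = Vision.vision().onDeviceTextRecognizer()

    /// Runs on-device text recognition and hands back the recognized text,
    /// or nil if recognition failed.
    public func recognizeText(in image: UIImage,
                              completion: @escaping (String?) -> Void) {
        let visionImage = VisionImage(image: image)
        recognizer.process(visionImage) { result, _ in
            completion(result?.text)
        }
    }
}
```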
To patch this issue, at least temporarily, I decided to give CIFilters a go. Here are some results.
Converting the image to grayscale with `CIFilter(name: "CIPhotoEffectNoir")` alleviates the issue a great deal: the sample 'bad' image I provided no longer crashes MLKit after the filter is applied. I still managed to crash the app by capturing similar frames (of the same area of the document), but much less often. I then also applied `CIFilter(name: "CIColorControls")` with `setValue(NSNumber(value: 1.5), forKey: "inputContrast")`, which helped even more. However, I am still trying to find a better workaround; e.g. I can still easily crash MLKit by capturing a 'perfect' white-and-gray image by opening a text editor and scanning my computer monitor (2019 15" MBP).
Here are the filters we have resorted to. They alleviate the issue pretty well and allow us to scan both printed text and computer LCDs; however, if you really want to (and know how), you can still crash the thing.
`CIFilter(name: "CIPhotoEffectMono")`
`CIFilter(name: "CIGammaAdjust")` with `setValue(NSNumber(value: 2.15), forKey: "inputPower")`
`CIFilter(name: "CIColorControls")` with `setValue(NSNumber(value: 1.2), forKey: "inputContrast")` and `setValue(NSNumber(value: 0.0), forKey: "inputBrightness")`
Also, we crop the image before applying filters (we take roughly 10-15% of the area of a 1080x1920 frame), so performance isn't bad either.
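A minimal sketch of that preprocessing chain, assuming the crop happens first and the three filters run in the order listed with the values above. The function name, crop rectangle handling, and glue code are illustrative assumptions; only the filter names and parameter values come from the comment.

```swift
import CoreImage
import UIKit

// Hypothetical helper illustrating the workaround described above:
// crop a region of interest, convert to mono, raise gamma, then bump
// contrast before handing the image to ML Kit.
func preprocessForTextRecognition(_ image: UIImage, cropRect: CGRect) -> UIImage? {
    // cropRect is in CGImage pixel coordinates.
    guard let cgImage = image.cgImage?.cropping(to: cropRect) else { return nil }
    var ciImage = CIImage(cgImage: cgImage)

    // 1. Grayscale.
    if let mono = CIFilter(name: "CIPhotoEffectMono") {
        mono.setValue(ciImage, forKey: kCIInputImageKey)
        ciImage = mono.outputImage ?? ciImage
    }

    // 2. Gamma adjustment (value from the comment above).
    if let gamma = CIFilter(name: "CIGammaAdjust") {
        gamma.setValue(ciImage, forKey: kCIInputImageKey)
        gamma.setValue(NSNumber(value: 2.15), forKey: "inputPower")
        ciImage = gamma.outputImage ?? ciImage
    }

    // 3. Contrast bump, brightness unchanged.
    if let controls = CIFilter(name: "CIColorControls") {
        controls.setValue(ciImage, forKey: kCIInputImageKey)
        controls.setValue(NSNumber(value: 1.2), forKey: "inputContrast")
        controls.setValue(NSNumber(value: 0.0), forKey: "inputBrightness")
        ciImage = controls.outputImage ?? ciImage
    }

    // Render back to a UIImage for the recognizer.
    let context = CIContext()
    guard let output = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: output)
}
```

Cropping before filtering keeps the CoreImage work proportional to the region of interest rather than the full frame, which matches the performance note above.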
Hi @FoldedLaundry, I tried to reproduce your issue using our quickstart and it worked fine. I suspect the real cause is a race condition in your threading, or in how you crop and access the image buffer. Please try the same images with the quickstart and re-open the issue if it persists.
@ulukaya I have been unable to reproduce the issue within the quickstart project; however, I don't think your suspicion is correct. The code I originally provided with this issue is literally all it takes to reproduce the crash. I have written a unit test inside our project whose only action is to call Firebase code, and it still crashes. There's no camera capture, cropping, or filtering involved.
// This class exists outside the test.
class CustomVisionImage: VisionImage {
    override init(image: UIImage) {
        super.init(image: image)
        metadata = VisionImageMetadata()
        metadata?.orientation = .topLeft
    }
}

// Test
let filePath = Bundle.main.path(forResource: "GoogleService-Info", ofType: "plist")!
let options = FirebaseOptions(contentsOfFile: filePath)!
FirebaseApp.configure(options: options)

let exp = defaultExpectation()
let recognizer = Vision.vision().onDeviceTextRecognizer()
let visionImage = CustomVisionImage(image: faultyImage)
recognizer.process(visionImage) { (resultOrNil, _) in
    // This closure is never called.
    print(resultOrNil?.text ?? "Nope.")
    exp.fulfill()
}
This shouldn't be relevant, but our project doesn't use CocoaPods. I've heard of rare cases where an external dependency behaves differently when imported with CocoaPods, but I haven't verified that.
@FoldedLaundry can you share a fully symbolicated stack trace? Also, if you're able to share a sample project that reproduces this issue, please do so as well.
@morganchen12 Take a look at this. The 'good image' works fine; the 'bad image', as you can probably guess, crashes the app. Its improved version, with contrast bumped manually outside the project, also works. To my surprise, this sample project reproduces the issue even though it uses CocoaPods.
You need to provide your own `GoogleService-Info.plist` and run `pod install`.
@morganchen12 Was my sample project of any use?
@ryanwilson @ulukaya @morganchen12 Guys, any luck? :)
Sorry for the slow response, @FoldedLaundry, much of the team was out on vacation last week.
We've handed the bug off to the ML Kit team and will update here when they have a fix.
Hi @FoldedLaundry, we tried the images you provided against the quickstart app and were unable to reproduce the issue.
Closing since I wasn't able to reproduce this issue with the latest ML Kit pods.
Step 2: Describe your environment
Step 3: Describe the problem
There are images that cause MLKit to crash (heap corruption; the free list is damaged). We capture a live feed with AVFoundation, occasionally grab one frame, crop it, and subject it to text recognition (as a UIImage).
Steps to reproduce:
Scan light-colored text printed against light-colored background (like gray text over white background). Clipping the text (best: 30% from the bottom) helps reproduce the issue.
I am attaching two images to this issue. One of them works 100% fine and the other one always crashes the library.
Relevant Code: