apple / coremltools

Core ML tools contain supporting tools for Core ML model conversion, editing, and validation.
https://coremltools.readme.io
BSD 3-Clause "New" or "Revised" License

ML model doesn't load on background thread #611

Open bhavin250495 opened 4 years ago

bhavin250495 commented 4 years ago

🐞Describe the bug

Trace

2020-01-31 12:17:10.135653+0530 ** [3505:875445] [coreml] Error in adding network -1.
2020-01-31 12:17:10.139437+0530 ** [3505:875445] [coreml] MLModelAsset: load failed with error Error Domain=com.apple.CoreML Code=0 "Error in declaring network." UserInfo={NSLocalizedDescription=Error in declaring network.}
2020-01-31 12:17:10.140146+0530 ** [3505:875445] [coreml] MLModelAsset: modelWithError: load failed with error Error Domain=com.apple.CoreML Code=0 "Error in declaring network." UserInfo={NSLocalizedDescription=Error in declaring network.}
Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.CoreML Code=0 "Error in declaring network." UserInfo={NSLocalizedDescription=Error in declaring network.}: file ML.swift, line 105

To Reproduce

System environment (please complete the following information):

anilkatti commented 4 years ago

@bhavin250495 could you share the code that you are using to run the prediction in the background? One thing to keep in mind is that in background mode only the CPU is available. You can restrict the model to the CPU by setting MLModelConfiguration.computeUnits.
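For reference, a minimal sketch of that suggestion, assuming an Xcode-generated SmartML class whose configuration initializer throws (names taken from this thread):

    import CoreML

    /// Sketch: load the generated SmartML model restricted to the CPU so it can
    /// also run while the app is in the background.
    func loadBackgroundSafeModel() throws -> SmartML {
        let config = MLModelConfiguration()
        config.computeUnits = .cpuOnly   // CPU is the only compute unit available in the background
        return try SmartML(configuration: config)
    }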

bhavin250495 commented 4 years ago

@anilkatti Thanks for replying. I tried adding MLPredictionOptions, but I don't have any problem with prediction; the issue is with loading the ML model from a local URL.

  1. I trigger a silent notification from the server to start inference in the app
  2. When the app gets the silent notification, it does some preprocessing and creates the input for the ML model
  3. Now, when I try to run prediction on the ML model while the app is in the background, it either crashes or takes forever to load the model
    func predict(input: MLInput) throws -> Bool {
        print("Loading model")
        // Model doesn't load here when the app is in the background
        let model = SmartML()

        let options = MLPredictionOptions()
        options.usesCPUOnly = true

        let outputLabel = try model.prediction(input: input, options: options).classLabel
        return outputLabel == "yes"
    }
anilkatti commented 4 years ago

My suggestion was to use the SmartML(configuration:) method to load the model, but I did not realize that you were running this on macOS. Based on the error, it seems like the app is having trouble reading the model files from disk in background mode. Could you share a repro case so I can debug the root cause? The model itself looks good.
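Not an official recipe, just a sketch of loading straight from the compiled model URL with an explicit configuration, since the trace points at the load step rather than prediction (the resource name and the generated init(model:) wrapper are assumptions):

    import CoreML
    import Foundation

    /// Sketch: load the compiled .mlmodelc from the app bundle with a CPU-only configuration.
    func loadModelFromURL() throws -> SmartML {
        // Hypothetical resource name; adjust to the actual compiled model in the bundle.
        guard let modelURL = Bundle.main.url(forResource: "SmartML", withExtension: "mlmodelc") else {
            throw NSError(domain: "ModelLoading", code: 1,
                          userInfo: [NSLocalizedDescriptionKey: "Compiled model not found in bundle"])
        }

        let config = MLModelConfiguration()
        config.computeUnits = .cpuOnly

        // This is the call that fails with "Error in declaring network." in the trace above.
        let rawModel = try MLModel(contentsOf: modelURL, configuration: config)
        return SmartML(model: rawModel)
    }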

bhavin250495 commented 4 years ago

I am using this model on iOS, and if I share a repo to reproduce the error you will need to set up silent notifications. Could you run the inference using a local timer in the background to reproduce the error?

hfnvbh commented 4 years ago

Hi all, is there any update on this issue? I hit the same problem: when the Core ML model is loaded in background mode after a silent notification in my iOS app, I get the same "Error in declaring network." message.

When I initialized the model in the foreground, or in BGProcessing mode, it worked fine.
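Since BGProcessing reportedly works, here is a hedged sketch of moving the inference into a BGProcessingTask scheduled from the notification handler, instead of loading the model directly in the silent-push callback (the task identifier and the SmartML call are placeholders, and the identifier must also be declared in Info.plist):

    import BackgroundTasks
    import CoreML

    let inferenceTaskID = "com.example.app.runInference"   // placeholder identifier

    /// Call once at app launch.
    func registerInferenceTask() {
        let registered = BGTaskScheduler.shared.register(forTaskWithIdentifier: inferenceTaskID,
                                                         using: nil) { task in
            guard let processingTask = task as? BGProcessingTask else { return }
            do {
                // Load and run the model here; BGProcessing gives more headroom than a silent push.
                let config = MLModelConfiguration()
                config.computeUnits = .cpuOnly
                _ = try SmartML(configuration: config)
                processingTask.setTaskCompleted(success: true)
            } catch {
                processingTask.setTaskCompleted(success: false)
            }
        }
        print("Inference task registered: \(registered)")
    }

    /// Call from the silent-notification handler instead of running inference inline.
    func scheduleInference() {
        let request = BGProcessingTaskRequest(identifier: inferenceTaskID)
        request.requiresNetworkConnectivity = false
        try? BGTaskScheduler.shared.submit(request)
    }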

anilkatti commented 4 years ago

Silent notifications are not meant as a way to keep your app awake in the background beyond quick refresh operations, nor are they meant for high priority updates.

I am looking for clear guidance for this scenario. I will update this thread.