tensorflow / models

Models and examples built with TensorFlow

deeplab tflite and mobilenet ssd cannot run serially #8971

Open WindowsDriver opened 4 years ago

WindowsDriver commented 4 years ago

Prerequisites

Please answer the following questions for yourself before submitting an issue.

1. The entire URL of the file you are using

MobileNet SSD model: http://download.tensorflow.org/models/object_detection/ssdlite_mobiledet_cpu_320x320_coco_2020_05_19.tar.gz
DeepLab model: http://download.tensorflow.org/models/deeplabv3_mnv2_pascal_train_aug_2018_01_29.tar.gz

2. Describe the bug

The DeepLab tflite model runs successfully on its own, and the MobileNet SSD tflite model also runs successfully on its own. When the two models are run serially in a loop (first run the SSD model, process the picture, close the model; then run the DeepLab model, process the picture, close the model; then SSD again, and so on), the program eventually stops at the point where the DeepLab model is initialized. Sometimes this happens within one hour, sometimes after five hours. The stop does not report an error or throw an exception.

3. Steps to reproduce

  1. Convert the SSD and Deeplab models to tflite and put them on the android device.
  2. Initialize the SSD mobilenet model, perform target detection, and close the model
  3. Initialize the Deeplab model, perform segmentation, and close the model
  4. The second and third steps run alternately. At some point the hang appears, and the time of occurrence is not deterministic (a sketch of this loop follows below).
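A minimal sketch of the alternating loop from these steps, assuming the standard TensorFlow Lite Java Interpreter API; the class name, model buffers, and input/output objects are placeholders, not code from this report:

import org.tensorflow.lite.Interpreter;
import java.nio.MappedByteBuffer;

public class SerialRunRepro {
    // Sketch only: alternate SSD and DeepLab, creating and closing each
    // interpreter on every iteration, as described in the repro steps.
    public void runLoop(MappedByteBuffer ssdModel, MappedByteBuffer deeplabModel,
                        Object ssdInput, Object ssdOutput,
                        Object deeplabInput, Object deeplabOutput) {
        while (true) {
            // Step 2: initialize the SSD model, run detection, close it.
            Interpreter ssd = new Interpreter(ssdModel, new Interpreter.Options());
            ssd.run(ssdInput, ssdOutput);
            ssd.close();

            // Step 3: initialize the DeepLab model, run segmentation, close it.
            // The reported hang occurs inside this constructor call.
            Interpreter deeplab = new Interpreter(deeplabModel, new Interpreter.Options());
            deeplab.run(deeplabInput, deeplabOutput);
            deeplab.close();
        }
    }
}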

4. Expected behavior

The models should run normally without stopping, or at least throw an exception when there is no result for a long time.

5. Additional context

There is no error log. The program just stops at the point where the model is initialized; no errors are reported and no exceptions are thrown. It does not crash, it simply becomes completely unresponsive, as if it had entered an infinite loop.
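Since the hang produces no log and no exception, one way to at least surface it (a sketch, not something from the original report) is to create the interpreter on a worker thread and bound the wait with a timeout, using the standard java.util.concurrent API:

import org.tensorflow.lite.Interpreter;
import java.nio.MappedByteBuffer;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public final class GuardedInit {
    // Sketch: wrap interpreter creation in a Future so a silent hang
    // becomes a TimeoutException instead of blocking forever.
    public static Interpreter createWithTimeout(MappedByteBuffer model, long seconds)
            throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        try {
            Future<Interpreter> future =
                    executor.submit(() -> new Interpreter(model, new Interpreter.Options()));
            // Throws TimeoutException if initialization never returns.
            return future.get(seconds, TimeUnit.SECONDS);
        } finally {
            executor.shutdownNow();
        }
    }
}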

6. System information

Android OS: Android 6.0, compileSdkVersion 25

defaultConfig {
    minSdkVersion 21
    targetSdkVersion 21
    versionCode 1
    versionName "1.0"

    testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
    ndk {
        abiFilters 'armeabi-v7a'
    }
}
WindowsDriver commented 4 years ago

The problem submitted above sometimes appears immediately and sometimes takes a long time to appear. However, if the two models are initialized at once (MobileNet V2 SSD + DeepLab), the DeepLab model initialization always hangs: the initialization call does not report an error or throw an exception, and the program does not continue (a sketch of this case follows below).
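For reference, the always-reproducing case is creating both interpreters back to back while the first one is still open; a sketch with placeholder buffer names, assuming the standard Interpreter API:

import org.tensorflow.lite.Interpreter;
import java.nio.MappedByteBuffer;

public final class SimultaneousInitRepro {
    // Per the report, the second constructor call below never returns
    // when both models are loaded in the same process.
    public static void initBoth(MappedByteBuffer ssdModel, MappedByteBuffer deeplabModel) {
        Interpreter ssd = new Interpreter(ssdModel, new Interpreter.Options());
        Interpreter deeplab = new Interpreter(deeplabModel, new Interpreter.Options());
        // ... run inference with both, then release them when finished.
        deeplab.close();
        ssd.close();
    }
}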

WindowsDriver commented 4 years ago
private List<String> loadLabelList(String labelFilePath) throws Exception {
    List<String> labels = new ArrayList<>();
    BufferedReader bufferedReader = null;
    try {
        // Read one label per line from the label file.
        bufferedReader = new BufferedReader(new FileReader(labelFilePath));
        String line;
        while ((line = bufferedReader.readLine()) != null) {
            labels.add(line);
        }
    } finally {
        // Make sure the reader is closed even if reading fails.
        if (bufferedReader != null) {
            bufferedReader.close();
        }
    }
    return labels;
}

private MappedByteBuffer loadModelFile(String modelFilePath) throws Exception {
    MappedByteBuffer mbb = null;
    FileInputStream inputStream = null;
    try {
        // Memory-map the whole .tflite file as a read-only buffer.
        inputStream = new FileInputStream(modelFilePath);
        FileChannel fileChannel = inputStream.getChannel();
        long declaredLength = fileChannel.size();
        mbb = fileChannel.map(FileChannel.MapMode.READ_ONLY, 0, declaredLength);
    } finally {
        if (inputStream != null) {
            inputStream.close();
        }
    }
    return mbb;
}

public boolean createInterpreter() throws Exception {
    boolean isCreated = false;
    labelsList = loadLabelList(modelFileInfo.getLabelsFileName());
    modelFileName = modelFileInfo.getModelFileName();
    MappedByteBuffer tfLiteModel = loadModelFile(modelFileInfo.getModelFileName());

    Interpreter.Options tfLiteOptions = new Interpreter.Options();
    if (modelFileInfo.isUseGPU()) {
        // Reuse a single GPU delegate instance stored in the field.
        if (gpuDelegate == null) {
            gpuDelegate = new GpuDelegate();
        }
        tfLiteOptions.addDelegate(gpuDelegate);
    }
    // The reported hang occurs inside this constructor for the DeepLab model.
    tfLiteInterpreter = new Interpreter(tfLiteModel, tfLiteOptions);
    tfLiteModel.clear();

    initDataDefinition();
    isCreated = true;
    isDetectorCreated = isCreated;
    return isCreated;
}
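The report says each model is closed between runs, but the posted code only shows creation. A minimal release sketch, assuming the field names used above and the standard Interpreter.close() and GpuDelegate.close() methods:

// Sketch: release the interpreter and GPU delegate created in createInterpreter().
public void closeInterpreter() {
    if (tfLiteInterpreter != null) {
        tfLiteInterpreter.close();
        tfLiteInterpreter = null;
    }
    if (gpuDelegate != null) {
        gpuDelegate.close();
        gpuDelegate = null;
    }
    isDetectorCreated = false;
}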
YknZhu commented 4 years ago

Does this issue happen to deeplab model only (while running with SSD)?

WindowsDriver commented 4 years ago

Does this issue happen to deeplab model only (while running with SSD)?

Yes.

WindowsDriver commented 4 years ago

Does this issue happen to deeplab model only (while running with SSD)?

Yes. Could you help me with this question? Thank you.