dokun1 / Lumina

A camera designed in Swift for easily integrating CoreML models - as well as image streaming, QR/Barcode detection, and many other features
https://david.okun.io
MIT License

`streamed` not being called #85

Closed SamuelMarks closed 6 years ago

SamuelMarks commented 6 years ago

Describe the bug: The `streamed` delegate method is never called while streaming frames, so no predictions are ever displayed on screen.

To Reproduce: Run the code below and check the console. The "streamed" message is never logged, and no text ever appears on screen (unless I uncomment the "testing" prompt).

import UIKit
import AVFoundation
import Lumina

class MainVC: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        LuminaViewController.loggingLevel = .verbose

        let camera = LuminaViewController()
        camera.delegate = self

        if #available(iOS 11.0, *) {
            camera.streamFrames = true
            camera.textPrompt = ""
            camera.trackMetadata = true
            camera.streamingModels = [MobileNet(), Inceptionv3()]
        } else {
            print("Warning: this iOS version doesn't support CoreML")
        }

        // camera.textPrompt = "testing"
        present(camera, animated: true, completion: nil)
    }
}

extension MainVC: LuminaDelegate {
    func dismissed(controller: LuminaViewController) {
        controller.dismiss(animated: true, completion: nil)
    }

    func streamed(videoFrame: UIImage, with predictions: [LuminaRecognitionResult]?, from controller: LuminaViewController) {
        print("streamed")
        if #available(iOS 11.0, *) {
            guard let predicted = predictions else {
                return
            }
            var resultString = String()
            for prediction in predicted {
                guard let values = prediction.predictions else {
                    continue
                }
                guard let bestPrediction = values.first else {
                    continue
                }
                resultString.append("\(String(describing: prediction.type)): \(bestPrediction.name)" + "\r\n")
            }
            controller.textPrompt = resultString
        } else {
            print("Warning: this iOS version doesn't support CoreML")
        }
    }
}

Expected behavior: I followed the YouTube video, so I expect predictions to show on screen. Currently no text appears, and there is no console output to indicate that the function is ever hit. `dismissed` works, though.


dokun1 commented 6 years ago

Hi @SamuelMarks - thanks for opening the issue!

I think this means I need to re-record my video, as the interface for this changed in v1.3.0. Can you try following the readme, which has been updated, to get your project working? The main line of code changes to:

camera.streamingModels = [LuminaModel(model: MobileNet().model, type: "MobileNet")]

You can also check the sample app for how this is done. Let me know if this works for you - if it does, then I definitely have to update the video.

SamuelMarks commented 6 years ago

Thanks, changing to this worked:

if #available(iOS 11.0, *) {
    camera.streamingModels = [
        LuminaModel(model: MobileNet().model, type: "MobileNet"),
        LuminaModel(model: Inceptionv3().model, type: "Inceptionv3")
    ]
} else {
    print("Warning: this iOS version doesn't support CoreML")
}
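
For completeness, here is a minimal sketch of the full updated setup, combining the v1.3.0 `LuminaModel` wrappers above with the configuration from the original post (it assumes the same MobileNet and Inceptionv3 models are compiled into the target, and that the `LuminaDelegate` extension is unchanged):

import UIKit
import AVFoundation
import Lumina

class MainVC: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        LuminaViewController.loggingLevel = .verbose

        let camera = LuminaViewController()
        camera.delegate = self

        if #available(iOS 11.0, *) {
            camera.streamFrames = true
            camera.trackMetadata = true
            // v1.3.0 API: wrap each Core ML model in a LuminaModel with a type label
            camera.streamingModels = [
                LuminaModel(model: MobileNet().model, type: "MobileNet"),
                LuminaModel(model: Inceptionv3().model, type: "Inceptionv3")
            ]
        } else {
            print("Warning: this iOS version doesn't support CoreML")
        }

        present(camera, animated: true, completion: nil)
    }
}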