microsoft / Cognitive-Face-iOS

iOS SDK for the Microsoft Face API, part of Cognitive Services
https://www.microsoft.com/cognitive-services/en-us/face-api

How to detect emotion using Microsoft Face API iOS client library? #43

Closed: adincebic closed this issue 6 years ago

adincebic commented 6 years ago

Hello,

I would like to use Microsoft Cognitive Services in my iOS app. Currently I can do some basic operations with it, but I cannot find a way to detect emotion in a photo.

Here is what I can do now:

let client = MPOFaceServiceClient(endpointAndSubscriptionKey: "https://westcentralus.api.cognitive.microsoft.com/face/v1.0/", key: "MY_KEY")

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    guard let image = info[UIImagePickerControllerOriginalImage] as? UIImage else { return }
    let data = UIImageJPEGRepresentation(image, 0.7)!
    client?.detect(with: data, returnFaceId: true, returnFaceLandmarks: true, returnFaceAttributes: nil) { (result, error) in
        if let error = error {
            print(error.localizedDescription)
        } else {
            guard let result = result else { return }
            for face in result {
                print(face.faceId)
                // This is as far as I get; how do I obtain the emotion for this face?
                let emotion = MPOFaceEmotion()
            }
        }
    }
}

The sample project is not that useful, and I have found it very difficult to find any documentation for the iOS client library.

huxuan commented 6 years ago

Hi @adincebic,

We also have a sample app for reference. Detection-related code can be found here. If you have any specific questions, please feel free to reopen this issue or create a new one.
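In short, emotion comes back as a face attribute: you request it through the returnFaceAttributes parameter of detect (passing nil there, as in the snippet above, returns no attributes) and then read it off each returned MPOFace. Below is a minimal sketch of that idea. It assumes, based on the SDK's Objective-C headers, that the attribute enum's emotion case imports into Swift as MPOFaceAttributeType.emotion and that each MPOFace exposes attributes.emotion with per-emotion confidence scores; verify the exact names against the headers in your SDK version.

// Sketch only: the names below follow the SDK's Objective-C headers and
// may differ slightly in your version; check the headers before relying on them.
let attributes = [NSNumber(value: MPOFaceAttributeType.emotion.rawValue)]

client?.detect(with: data,
               returnFaceId: true,
               returnFaceLandmarks: false,
               returnFaceAttributes: attributes) { (result, error) in
    if let error = error {
        print(error.localizedDescription)
        return
    }
    for face in result ?? [] {
        // Each emotion is a confidence score in [0, 1]; the highest
        // score is the most likely emotion for this face.
        if let emotion = face.attributes?.emotion {
            print("happiness: \(emotion.happiness), sadness: \(emotion.sadness)")
        }
    }
}

The same pattern applies to the other attributes (age, gender, smile, and so on): add the corresponding enum value to the array passed as returnFaceAttributes and read the matching property from face.attributes.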