onybo closed this issue 7 years ago
Function is created and a deployment script added. The result is stored in DocumentDB with the document id equal to the photo file name (which must be unique). As there might be multiple faces in each photo, an array of emotions is stored for each photo, with a rectangle position marker linking each face to its emotions.
Example of what's stored in DocumentDB:
```json
{
  "id": "someUniqueFileName.jpg",
  "FileName": "someUniqueFileName.jpg",
  "Emotions": [
    {
      "Anger": 0.1954694,
      "Contempt": 0.003568499,
      "Disgust": 0.0285183731,
      "Fear": 0.00753929745,
      "Happiness": 0.685596,
      "Neutral": 0.0425092354,
      "Sadness": 0.00521258637,
      "Surprise": 0.0315866247,
      "FaceRectangle": {
        "Left": 836,
        "Top": 448,
        "Width": 73,
        "Height": 59
      }
    },
    {
      "Anger": 0.345953882,
      "Contempt": 3.909431E-06,
      ...
    }
  ],
  "_rid": "X1QlAOV3EwACAAAAAAAAAA==",
  "_self": "dbs/X1QlAA==/colls/X1QlAOV3EwA=/docs/X1QlAOV3EwACAAAAAAAAAA==/",
  "_etag": "\"0000fe2d-0000-0000-0000-58f4c66c0000\"",
  "_attachments": "attachments/",
  "_ts": 1492436584
}
```
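As a rough sketch (not the actual function code, which is presumably C#), the document above can be built from an Emotion API response like this. The response shape assumed here is the Emotion API v1 format (a list of faces, each with `faceRectangle` and `scores`); the output field names mirror the example document.

```python
def build_document(file_name, emotion_api_response):
    """Build the DocumentDB document for one photo.

    `file_name` doubles as the document id, so it must be unique.
    `emotion_api_response` is the parsed JSON list returned by the
    Emotion API: one entry per detected face.
    """
    emotions = []
    for face in emotion_api_response:
        scores = face["scores"]
        rect = face["faceRectangle"]
        emotions.append({
            # All eight Emotion API values are stored.
            "Anger": scores["anger"],
            "Contempt": scores["contempt"],
            "Disgust": scores["disgust"],
            "Fear": scores["fear"],
            "Happiness": scores["happiness"],
            "Neutral": scores["neutral"],
            "Sadness": scores["sadness"],
            "Surprise": scores["surprise"],
            # The rectangle links this score set back to a face in the photo.
            "FaceRectangle": {
                "Left": rect["left"],
                "Top": rect["top"],
                "Width": rect["width"],
                "Height": rect["height"],
            },
        })
    return {"id": file_name, "FileName": file_name, "Emotions": emotions}
```

DocumentDB then adds the `_rid`, `_self`, `_etag`, `_attachments`, and `_ts` system properties on insert.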
Azure Function app that monitors a blob-storage container. When an image is uploaded to the container, the function sends it to the Emotion API and stores the result in DocumentDB. All Emotion API values should be stored: "anger", "contempt", "disgust", "fear", "happiness", "neutral", "sadness", "surprise".
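The flow above (blob upload → Emotion API → DocumentDB) can be sketched as a single pipeline function. This is a hypothetical illustration, not the deployed code: `call_emotion_api` and `store_document` stand in for the real HTTP/SDK calls, since the endpoint, API key, and collection name are deployment-specific.

```python
def process_photo(file_name, image_bytes, call_emotion_api, store_document):
    """Run one uploaded photo through the pipeline.

    The collaborators are injected so the flow can be exercised without
    Azure:
      call_emotion_api(image_bytes) -> parsed Emotion API response
                                       (list of faces with lowercase keys)
      store_document(doc)           -> persists the document to DocumentDB
    """
    faces = call_emotion_api(image_bytes)
    doc = {
        "id": file_name,        # blob file name, must be unique
        "FileName": file_name,
        "Emotions": [
            # Capitalize the lowercase Emotion API keys to match the
            # stored document format shown earlier.
            {**{k.capitalize(): v for k, v in face["scores"].items()},
             "FaceRectangle": {k.capitalize(): v
                               for k, v in face["faceRectangle"].items()}}
            for face in faces
        ],
    }
    store_document(doc)
    return doc
```

In the real function app, the blob trigger binding would supply `file_name` and `image_bytes`, and the DocumentDB output binding would replace `store_document`.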