HyperPandaz / Serverless

GNU General Public License v3.0

Emotionalgifs #19

Closed by HyperPandaz 3 years ago

ghost commented 3 years ago

Week 2 Step 3 ⬤⬤⬤◯◯◯◯◯ | 🕐 Estimated completion: 10-30 minutes

Getting Emotional ~ With the Face API

Remember back in step 1 when we used Postman to call the API? Now we're going to call the Face API from our Azure Function!

✅ Task:

Create an Azure function that can send a static image to the Face API that returns emotion data

🚧 Test your Work

Use Postman! Paste the function url and make a POST request, remembering to attach the image file in Body. Make sure you're using an image with a real face in it, or the call won't return any results. Here's an example of the output:

:white_check_mark: Expected Output
The output should be in JSON format like so:

```json
{
  "result": [
    {
      "faceId": "a16f522d-0577-4e50-97cb-1feeef7aaf2e",
      "faceRectangle": {
        "top": 313,
        "left": 210,
        "width": 594,
        "height": 594
      },
      "faceAttributes": {
        "emotion": {
          "anger": 0,
          "contempt": 0,
          "disgust": 0,
          "fear": 0,
          "happiness": 1,
          "neutral": 0,
          "sadness": 0,
          "surprise": 0
        }
      }
    }
  ]
}
```

1. Define keys to authenticate the request

This function takes in one parameter, img, which contains the image we're trying to analyze. Inside, we declare the two variables involved in the call: subscriptionKey and uriBase.

The process.env object allows you to access super-secret values in your backend. This prevents hackers from getting your keys and doing bad stuff (like exploiting your resources).

async function analyzeImage(img){
    const subscriptionKey = process.env.SUBSCRIPTIONKEY;
    const uriBase = process.env.ENDPOINT + '/face/v1.0/detect';
}

Follow this Documentation to save your keys in your Function App's application settings.

💡 When you put the key and endpoint in Azure's application settings, they will only be available once the function is deployed and running at its Azure function url. In order to use Postman and test this on your computer before deploying to Azure, you will need to hardcode the subscriptionKey and uriBase like this:

const subscriptionKey = "YOUR_SUBSCRIPTIONKEY"
const uriBase = "YOUR_LOCALHOST_URL"

💡 Afterwards, when committing the code to GitHub for the counselorbot to check, you will need to replace the hardcoded subscriptionKey and uriBase values with process.env.SUBSCRIPTIONKEY and process.env.ENDPOINT

❓ Why do we need to do this?
When running your program on your own computer, the program has no way of accessing Microsoft Azure's function application settings, so `process.env` will not work. However, once you deploy the function onto Azure, the function can "see" the application settings and use them through `process.env`.
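One common pattern that covers both cases is to fall back to hardcoded values only when the application settings aren't available. This is a hedged sketch, not part of the official steps; the literal strings below are placeholders, not real keys:

```javascript
// Use the Azure application settings when deployed; fall back to
// hardcoded placeholders when testing locally (e.g. with Postman).
// "YOUR_SUBSCRIPTIONKEY" and "https://YOUR_RESOURCE.cognitiveservices.azure.com"
// are placeholders you would replace with your own values.
const subscriptionKey = process.env.SUBSCRIPTIONKEY || "YOUR_SUBSCRIPTIONKEY";
const endpoint = process.env.ENDPOINT || "https://YOUR_RESOURCE.cognitiveservices.azure.com";
const uriBase = endpoint + "/face/v1.0/detect";

console.log(uriBase); // ends with "/face/v1.0/detect" either way
```

Just remember to remove any real keys from the fallback strings before committing to GitHub.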

2. Call the Face API

Create a new function analyzeImage(img) outside of module.exports that will handle analyzing the image. Keep in mind this function is async because we will be using the await keyword with the API call.

Get your Face API keys ready! We need to let our Face API know that we are authenticated to access this resource.

❓ What is an async function?
Javascript is what we call a "synchronous" language, meaning operations in Javascript block other operations from executing until they are complete, creating a sense of single directional flow. **This means that only one operation can happen at a time.** However, in order to maximize efficiency (save time and resources), Javascript allows the use of asynchronous functions. Async functions allow other operations to continue running as they are being executed. Refer to [this blog](https://dev.to/hardy613/asynchronous-vs-synchronous-programming-23ed) for more information.

Promises, similar to their real-life meaning, are objects that return a value at some point in the future, or a reason why that value could not be returned - they represent the result of an async function that may or may not be resolved yet.
> [Is JavaScript Synchronous or Asynchronous? What the Heck is a Promise?](https://medium.com/better-programming/is-javascript-synchronous-or-asynchronous-what-the-hell-is-a-promise-7aa9dd8f3bfb)
> [Promise - MDN](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise)
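To make this concrete, here's a minimal, standalone illustration (not part of the project code) of how `await` pauses an async function until a Promise resolves:

```javascript
// A helper that returns a Promise which resolves with `value` after `ms` milliseconds.
function delay(ms, value) {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

async function demo() {
  // Execution of demo() pauses here until the Promise resolves,
  // but the rest of the program keeps running in the meantime.
  const result = await delay(10, "done");
  return result;
}

demo().then(r => console.log(r)); // logs "done" after ~10ms
```

This is the same mechanism we'll rely on when we `await` the Face API call: `fetch` returns a Promise, and `await` unwraps its eventual value.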
❗️ Specify the parameters for your request!
In order to specify all of our parameters easily, we're going to create a new `URLSearchParams` object. Here's the object declared for you. I've also already specified one parameter, `returnFaceId`, as `true` to provide an example. Add in a new parameter that requests emotion. Remember, these parameters are coming from the [Face API documentation](https://westus.dev.cognitive.microsoft.com/docs/services/563879b61984550e40cbbe8d/operations/563879b61984550f30395236)!

```js
let params = new URLSearchParams({
    'returnFaceId': 'true',
    '': '' //FILL IN THIS LINE
})
```
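If you want to sanity-check your filled-in parameters before wiring them into the request, `URLSearchParams` can serialize itself with `toString()`. This sketch assumes the emotion parameter from the Face API docs (`returnFaceAttributes`), which is also shown in the "Still Confused?" section below:

```javascript
// Both parameters filled in; `returnFaceAttributes: 'emotion'` asks the
// Face API to include emotion scores in its response.
let params = new URLSearchParams({
    'returnFaceId': 'true',
    'returnFaceAttributes': 'emotion'
});

// toString() produces the query string that gets appended to uriBase.
console.log(params.toString()); // "returnFaceId=true&returnFaceAttributes=emotion"
```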

:hammer_and_wrench: Calling the API

In the same way we installed parse-multipart, install node-fetch. Read the API section of the documentation. We're going to make a call using the fetch(url, {options}) function.

To call the fetch function, use the await keyword. We need it because fetch returns a Promise, which is a proxy for a value that isn't currently known. You can read about Javascript promises here.

API Documentation can be tricky sometimes...Here's something to help

:bulb: Request Headers tell the receiving end of the request what type of data is in the body.

❓ How do I specify Request Headers?
Go back to the Face API documentation [here](https://westus.dev.cognitive.microsoft.com/docs/services/563879b61984550e40cbbe8d/operations/563879b61984550f30395236), and find the **Request headers** section. There are two headers that you need. I've provided the format below. Enter in the two header names and their two corresponding values.

FYI: The `Content-Type` header should be set to `'application/octet-stream'`. This specifies a binary file.

```js
//COMPLETE THE CODE
let resp = await fetch(uriBase + '?' + params.toString(), {
    method: '', //WHAT TYPE OF REQUEST?
    body: '', //WHAT ARE WE SENDING TO THE API?
    //ADD YOUR TWO HEADERS HERE
    headers: {
        '': ''
    }
})
```
❓ What is the URL?
Notice that the URL is just the uriBase with the params we specified earlier appended on. `const uriBase = '' + '/face/v1.0/detect';`
❓ Still Confused?
Fill in the `method` and `body`.

```js
async function analyzeImage(img){
    const subscriptionKey = '';
    const uriBase = '' + '/face/v1.0/detect';

    let params = new URLSearchParams({
        'returnFaceId': 'true',
        'returnFaceAttributes': 'emotion'
    })

    //COMPLETE THE CODE
    let resp = await fetch(uriBase + '?' + params.toString(), {
        method: '', //WHAT TYPE OF REQUEST?
        body: '', //WHAT ARE WE SENDING TO THE API?
        headers: {
            '': '' //do this in the next section
        }
    })

    let data = await resp.json();
    return data;
}
```

⏬ Receiving Data

Call the analyzeImage function in module.exports. Add the code below into module.exports.

Remember that parts represents the parsed multipart form data. It is an array of parts, each one described by a filename, a type, and its data. Since we only sent one file, it is stored at index 0, and we want the data property to access the binary file, hence parts[0].data. Then, in the HTTP response of our Azure function, we store the result of the API call.

//module.exports function
//analyze the image
var result = await analyzeImage(parts[0].data);

//store the result of the API call in the HTTP response
context.res = {
    body: {
        result
    }
};
context.log(result);
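If the shape of `parts` is still unclear, here's a toy, runnable sketch of what parse-multipart typically produces. The filename and bytes below are made up for illustration; only the structure (`filename`, `type`, `data`) matters:

```javascript
// Fabricated example of the parsed-parts array; a real one comes from
// multipart.Parse(). Each part has a filename, a MIME type, and its data.
const parts = [
  {
    filename: 'face.jpg',
    type: 'image/jpeg',
    data: Buffer.from([0xff, 0xd8, 0xff]) // the raw bytes of the uploaded file
  }
];

// parts[0].data is what we hand to analyzeImage(): the binary image itself.
console.log(parts[0].data.length); // 3 bytes in this toy example
```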
ghost commented 3 years ago

Week 2 Step 4 ⬤⬤⬤⬤◯◯◯◯ | 🕐 Estimated completion: 5-10 minutes

Getting Emotional ~ Returning the Dominant Emotion

✅ Task:

Modify your Azure Function so that it returns the Dominant Emotion of an Image.

1. Finding the Dominant Emotion

In order to match the results of the Face API with Gifs from the Giphy API, we need to determine the dominant emotion from the API response.

:hammer_and_wrench: Modifying the Azure Function

We need to access the emotion data by itself, without the face id and other analyzed data. To do this, we need to create another variable in the first async function in our Azure Function:

let emotions = result[0].faceAttributes.emotion;

:bulb: Now you've got the JSON object with all the emotion values, find the highest valued emotion! Use context.log(emotions) to see how it's structured.

We're accessing the data at index 0 because we're analyzing one face. If there were two faces, the data for the second face would be stored at index 1.

❓ How do I find the max value from the JSON object?
1️⃣ We need to create **an array** with the emotion values (ranging from 0 to 1) so that we can manipulate it and find the dominant emotion. `Object.values()` converts an object into an array, with each **value** in the object stored as a separate element:

```js
let objects = Object.values(WHAT_IS_YOUR_JSON); // FILL IT IN
// What your array could look like: [0.01, 0.34, .....]
```

> :bulb: In JSON, the **key** values are what you use to access the **value**. `{key: value}`, or in our case, `{emotion: value}`.

Finally, we need to find the dominant emotion in the array `objects`:

2️⃣ Let's break this line down.

```js
const main_emotion = Object.keys(emotions).find(key => emotions[key] === Math.max(...objects));
```

- `Math.max(...objects)` finds the max value. Let's say it's `0.99`.
- `Object.keys(emotions).find(key => emotions[key] === Math.max(...objects));` finds the emotion, or key, that matches the max value of `0.99`. Let's say it's `happiness`.

3️⃣ Now, `main_emotion` contains the dominant emotion! All we need to do is output `main_emotion` when the function is called:

```js
context.res = {
    // status: 200, /* Defaults to 200 */
    body: main_emotion
};
```
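Putting the steps together, here's a runnable worked example. The `emotions` object below is made up, but its shape matches the Face API response shown in the expected output earlier:

```javascript
// Fabricated emotions object, shaped like result[0].faceAttributes.emotion
// from the Face API response.
const emotions = {
  anger: 0, contempt: 0, disgust: 0, fear: 0,
  happiness: 1, neutral: 0, sadness: 0, surprise: 0
};

// Step 1: collect just the numeric scores into an array.
const objects = Object.values(emotions); // [0, 0, 0, 0, 1, 0, 0, 0]

// Step 2: find the key whose score equals the maximum score.
const main_emotion = Object.keys(emotions)
  .find(key => emotions[key] === Math.max(...objects));

console.log(main_emotion); // "happiness"
```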