FirebaseExtended / mlkit-material-ios

These apps demonstrate how to build an end-to-end user experience with the Google ML Kit APIs, following the new Material for ML design guidelines.
Apache License 2.0

Using own Cloud Vision Product Search #9

Closed m-atlantis closed 5 years ago

m-atlantis commented 5 years ago

Hey, I've been playing around with this app and want to link it to my own Cloud Vision Product Set to search through, but I'm unsure how to obtain the values I need to replace (APIKey, productSearchURL, acceptType). A resource file is mentioned, but no such file is available through the Cloud Vision Console, and looking through the API guides gave me no solution to my problem. Hope someone can help.

Thanks in advance

miworking commented 5 years ago

Hi Mathias, you can simply override the return values of APIKey, productSearchURL, and acceptType in mlkit-material-ios/ShowcaseApp/ShowcaseApp/Models/FIRProductSearchRequest.m.
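
For example, the overrides could look roughly like this (just a sketch with placeholder values; check the actual method signatures in that file):

// Sketch only: hardcode the three values in FIRProductSearchRequest.m.
- (NSString *)APIKey {
  return @"YOUR_API_KEY";
}

- (NSString *)productSearchURL {
  return @"YOUR_PRODUCT_SEARCH_URL";
}

- (NSString *)acceptType {
  return @"YOUR_ACCEPT_TYPE";
}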

Or, if you don't want to expose your keys in the code, you can prepare a key.plist with the following content and add it to the workspace; the code should be able to read the keys from it automatically:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>X-IVS-APIKey</key>
    <string><--------APIKEY---------></string>
    <key>Accept</key>
    <string><--------accept-type---></string>
    <key>URL</key>
    <string><--------URL-----------></string>
</dict>
</plist>
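
For reference, reading those keys back at runtime can be as simple as this (a minimal sketch, assuming the file is bundled as key.plist):

// Sketch: load the keys from the bundled key.plist.
NSString *path = [[NSBundle mainBundle] pathForResource:@"key" ofType:@"plist"];
NSDictionary *keys = [NSDictionary dictionaryWithContentsOfFile:path];
NSString *APIKey = keys[@"X-IVS-APIKey"];
NSString *acceptType = keys[@"Accept"];
NSString *productSearchURL = keys[@"URL"];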

Please let me know if it works for you.

Cheers!

m-atlantis commented 5 years ago

Hey again, I tried changing those, but my problem is that I'm unsure how to obtain the three values in the first place. I only got as far as error code 400 using the following:

URL: https://vision.googleapis.com/v1/images
APIKey: generated on the credentials page in the Cloud Console
AcceptType: application/json

I tried several other URLs, like https://vision.googleapis.com/v1/images:annotate?key= with and without the key inserted after the equals sign. I also tried https://vision.googleapis.com/v1p4beta1, yet I'm still unsure whether I've got the correct URL. But the key.plist is rather neat, I'll be using that for sure!

EDIT: I tried curling https://vision.googleapis.com/v1/images:annotate, which the docs at https://cloud.google.com/vision/docs/request mention as the endpoint for requests, but this returns a 404. Curling https://vision.googleapis.com/v1/projects/my-project-id/locations/europe-west1/productSets/myProductSet?key=MyKey returns a correct JSON response body. Using this URL in the code does not work, though.

The error response is:

  "error": {
    "code": 400,
    "message": "Invalid JSON payload received. Unknown name \"content\": Cannot find field.\nInvalid JSON payload received. Unknown name \"RU\": Cannot find field.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.BadRequest",
        "fieldViolations": [
          {
            "description": "Invalid JSON payload received. Unknown name \"content\": Cannot find field."
          },
          {
            "description": "Invalid JSON payload received. Unknown name \"RU\": Cannot find field."
          }
        ]
      }
    ]
  }
}

Thanks!

zhouyiself commented 5 years ago

Hi Mathias, it looks like you're following the general Cloud Vision API tutorial, but Product Search is not part of that and has its own guide: https://cloud.google.com/vision/product-search/docs/

m-atlantis commented 5 years ago

Hey, I followed that tutorial as well. I have been accessing Product Search successfully from a Python Flask application with my service account token, and it gives me the right results for a given image; the API used there is https://vision.googleapis.com/v1p4beta1/images:annotate. But I can't get any results through this API just by replacing those three values (APIKey, URL, and AcceptType) in the iOS app. At first I thought these three values were the problem, but I now believe the app is not creating JSON in the format expected by the endpoint.

EDIT: I finally got the JSON constructed correctly; I was right that the app was not creating correct JSON. Now it reaches Product Search. It says "0 search results" in the app, but there are no more API errors.

miworking commented 5 years ago

Hi Mathias,

You are right that APIKey, URL, and AcceptType are not enough to send a well-formed request to the Vision Product Search backend; the request was tailored to one of our testing backends. I will update this part of the code over the weekend to accommodate Vision Product Search. Stay tuned.

Thanks for reporting the bug!

Julie

miworking commented 5 years ago

Hi Mathias, a quick update: we were able to get a product image search result via a curl request like this:

curl -X POST \
-H 'Content-Type: application/json' \
'https://vision.googleapis.com/v1/images:annotate?key=<APIKey>' -d '{
  "requests": [
    {
      "image": {
        "source": {
          "gcsImageUri": "gs://cloud-ai-vision-data/product-search-tutorial/images/46a0cbcf70ba11e89399d20059124800.jpg"
        }
      },
      "features": [
        {
          "type": "PRODUCT_SEARCH",
          "maxResults": 30
        }
      ],
      "imageContext": {
        "productSearchParams": {
          "productSet": "projects/<project-id>/locations/<location-id>/productSets/product_set0",
          "productCategories": [
            "apparel"
          ]
        }
      }
    }
  ]
}'

However, we are still getting a 400 error in the Objective-C code. Since you seem to have worked around this JSON body construction problem, perhaps you can compare the parameters to see what can be updated on your side.

From our side, we are still trying to construct a valid JSON request body. Will let you know how it goes. We also very much welcome any insights from your side.

Thanks a lot!

m-atlantis commented 5 years ago

Hey, I created a quick solution for the JSON, which looks like this (NOTE: I also had to change the way the products are parsed from the JSON response to get results to show up in the app):

/** **** Image **** */
/** encodedString holds the base64-encoded bytes of the query image. */
NSMutableDictionary *encodedImage = [[NSMutableDictionary alloc] init];
[encodedImage setObject:encodedString forKey:@"content"];

/** **** Features **** */
NSMutableDictionary *featuresInner = [[NSMutableDictionary alloc] init];
[featuresInner setObject:@(5) forKey:@"maxResults"];
[featuresInner setObject:@"PRODUCT_SEARCH" forKey:@"type"];

NSMutableArray *features = [[NSMutableArray alloc] init];
[features addObject:featuresInner];

/** **** ImageContext **** */
NSMutableArray *productCategories = [[NSMutableArray alloc] init];
[productCategories addObject:@"homegoods-v2"];

NSMutableDictionary *productSearchParams = [[NSMutableDictionary alloc] init];
[productSearchParams setObject:productCategories forKey:@"productCategories"];
[productSearchParams setObject:@"projects/PROJECT_ID/locations/LOCATION_ID/productSets/PRODUCT_SET_ID" forKey:@"product_set"];
/** [productSearchParams setObject:emptyDict forKey:@"bounding_poly"]; */

NSMutableDictionary *imageContext = [[NSMutableDictionary alloc] init];
[imageContext setObject:productSearchParams forKey:@"productSearchParams"];

/** **** Requests Array Body **** */
NSMutableDictionary *arrayInner = [[NSMutableDictionary alloc] init];
[arrayInner setObject:encodedImage forKey:@"image"];
[arrayInner setObject:features forKey:@"features"];
[arrayInner setObject:imageContext forKey:@"imageContext"];

NSMutableArray *array = [[NSMutableArray alloc] init];
[array addObject:arrayInner];

NSMutableDictionary *JSONBody = [NSMutableDictionary dictionary];
JSONBody[@"requests"] = array;     
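
The dictionary is then serialized into the request body, roughly like this (a sketch; productSearchURL here stands for the annotate URL from my Key.plist below):

NSError *jsonError = nil;
NSData *body = [NSJSONSerialization dataWithJSONObject:JSONBody
                                               options:0
                                                 error:&jsonError];
NSMutableURLRequest *request =
    [NSMutableURLRequest requestWithURL:[NSURL URLWithString:productSearchURL]];
request.HTTPMethod = @"POST";
[request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
request.HTTPBody = body;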

In my Key.plist file, I have APIKey = "MY_APIKey", URL = "https://vision.googleapis.com/v1/images:annotate?key=MY_APIKey" and Accept = "application/json".

But I'm still having trouble with the images. I'm unsure how to use the returned image reference to show the image in the app; providing service account information to create a signed URL seems to be the way, so I will work towards that now. If you have any tips, I'd appreciate them! :) (This is my first time ever writing Objective-C, so that part is a challenge in itself.)

Thanks!

miworking commented 5 years ago

Thanks a lot for the information Mathias! I will compare it with mine.

About the images, you wrote: "I'm unsure how to use the given image-reference to show the image in the app, providing service account information to create a signed-url seems to be the way so I will work towards that now".

Are you talking about the original cropped image sent for the query, or the image URLs that you get in the response as search results? The former has already been handled in this showcase app in FIRVideoCamViewController.m, while for the latter, the code to fetch the images hasn't been covered yet. So if you are able to get the response successfully, you may need to initiate another round of requests to the server to fetch the images.
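
Something along these lines should do for that second round (a rough sketch, assuming resultImageURL is a directly fetchable NSURL taken from the search response):

NSURLSessionDataTask *task = [[NSURLSession sharedSession]
    dataTaskWithURL:resultImageURL
  completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (data == nil) {
      return;
    }
    UIImage *resultImage = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
      // Hand resultImage to the results UI here.
    });
  }];
[task resume];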

One more thing related to the image that needs mentioning: if I use the encoded string generated in the app, the curl command doesn't work, while if I use a random one from the internet, it does. So there might be a bug in the image encoding part. Still need some time to debug that.
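
(For comparison, the straightforward way to produce the content string would be something like this; a sketch only, with queryImage as the UIImage to encode, and not necessarily what the app does today:)

// Sketch: base64-encode the query image for the "content" field.
NSData *imageData = UIImageJPEGRepresentation(queryImage, 0.8);
NSString *encodedString = [imageData base64EncodedStringWithOptions:0];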

The latest one that worked for me was:

curl -X POST \
-H "Content-Type: application/json" \
"https://vision.googleapis.com/v1/images:annotate?key=<API Key>" -d '{
  "requests": [
    {
      "image": {
        "content": "R0lGODlhPQBEAPeoAJosM//AwO/AwHVYZ/z595kzAP/s7P+goOXMv8+fhw/v739/f+8PD98fH/8mJl+fn/9ZWb8/PzWlwv///6wWGbImAPgTEMImIN9gUFCEm/gDALULDN8PAD6atYdCTX9gUNKlj8wZAKUsAOzZz+UMAOsJAP/Z2ccMDA8PD/95eX5NWvsJCOVNQPtfX/8zM8+QePLl38MGBr8JCP+zs9myn/8GBqwpAP/GxgwJCPny78lzYLgjAJ8vAP9fX/+MjMUcAN8zM/9wcM8ZGcATEL+QePdZWf/29uc/P9cmJu9MTDImIN+/r7+/vz8/P8VNQGNugV8AAF9fX8swMNgTAFlDOICAgPNSUnNWSMQ5MBAQEJE3QPIGAM9AQMqGcG9vb6MhJsEdGM8vLx8fH98AANIWAMuQeL8fABkTEPPQ0OM5OSYdGFl5jo+Pj/+pqcsTE78wMFNGQLYmID4dGPvd3UBAQJmTkP+8vH9QUK+vr8ZWSHpzcJMmILdwcLOGcHRQUHxwcK9PT9DQ0O/v70w5MLypoG8wKOuwsP/g4P/Q0IcwKEswKMl8aJ9fX2xjdOtGRs/Pz+Dg4GImIP8gIH0sKEAwKKmTiKZ8aB/f39Wsl+LFt8dgUE9PT5x5aHBwcP+AgP+WltdgYMyZfyywz78AAAAAAAD///8AAP9mZv///wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACH5BAEAAKgALAAAAAA9AEQAAAj/AFEJHEiwoMGDCBMqXMiwocAbBww4nEhxoYkUpzJGrMixogkfGUNqlNixJEIDB0SqHGmyJSojM1bKZOmyop0gM3Oe2liTISKMOoPy7GnwY9CjIYcSRYm0aVKSLmE6nfq05QycVLPuhDrxBlCtYJUqNAq2bNWEBj6ZXRuyxZyDRtqwnXvkhACDV+euTeJm1Ki7A73qNWtFiF+/gA95Gly2CJLDhwEHMOUAAuOpLYDEgBxZ4GRTlC1fDnpkM+fOqD6DDj1aZpITp0dtGCDhr+fVuCu3zlg49ijaokTZTo27uG7Gjn2P+hI8+PDPERoUB318bWbfAJ5sUNFcuGRTYUqV/3ogfXp1rWlMc6awJjiAAd2fm4ogXjz56aypOoIde4OE5u/F9x199dlXnnGiHZWEYbGpsAEA3QXYnHwEFliKAgswgJ8LPeiUXGwedCAKABACCN+EA1pYIIYaFlcDhytd51sGAJbo3onOpajiihlO92KHGaUXGwWjUBChjSPiWJuOO/LYIm4v1tXfE6J4gCSJEZ7YgRYUNrkji9P55sF/ogxw5ZkSqIDaZBV6aSGYq/lGZplndkckZ98xoICbTcIJGQAZcNmdmUc210hs35nCyJ58fgmIKX5RQGOZowxaZwYA+JaoKQwswGijBV4C6SiTUmpphMspJx9unX4KaimjDv9aaXOEBteBqmuuxgEHoLX6Kqx+yXqqBANsgCtit4FWQAEkrNbpq7HSOmtwag5w57GrmlJBASEU18ADjUYb3ADTinIttsgSB1oJFfA63bduimuqKB1keqwUhoCSK374wbujvOSu4QG6UvxBRydcpKsav++Ca6G8A6Pr1x2kVMyHwsVxUALDq/krnrhPSOzXG1lUTIoffqGR7Goi2MAxbv6O2kEG56I7CSlRsEFKFVyovDJoIRTg7sugNRDGqCJzJgcKE0ywc0ELm6KBCCJo8DIPFeCWNGcyqNFE06ToAfV0HBRgxsvLThHn1oddQMrXj5DyAQgjEHSAJMWZwS3HPxT/QMbabI/iBCliMLEJKX2EEkomBAUCxRi42VDADxyTYDVogV+wSChqmKxEKCDAYFDFj4OmwbY7bDGdBhtrnTQYOigeChUmc1K3QTnAUfEgGFgAWt88hKA6aCRIXhxnQ1yg3BCayK44EWdkUQcBByEQChFXfCB776aQsG0BIlQgQgE8qO26X1h8cEUep8ngRBnOy74E9QgRgEAC8SvOfQkh7FDBDmS43PmGoIiKUUEGkMEC/PJHgxw0xH74yx/3XnaYRJgMB8obxQW6kL9QYEJ0FIFgByfIL7/IQAlvQwEpnAC7DtLNJCKUoO/w45c44GwCXiAFB/OXAATQryUxdN4LfFiwgjCNYg+kYMIEFkCKDs6PKAIJouyGWMS1FSKJOMRB/BoIxYJIUXFUxNwoIkEKPAgCBZSQHQ1A2EWDfDEUVLyADj5AChSIQW6gu10bE/JG2VnCZGfo4R4d0sdQoBAHhPjhIB94v/wRoRKQWGRHgrhGSQJxCS+0pCZbEhAAOw=="
      },
      "features": [
        {
          "type": "PRODUCT_SEARCH",
          "maxResults": 30
        }
      ],
      "imageContext": {
        "productSearchParams": {
          "productSet": "projects/<project_id>/locations/<location_id>/productSets/product_set0",
          "productCategories": [
            "apparel"
          ]
        }
      }
    }
  ]
}'

m-atlantis commented 5 years ago

I am talking about the image reference for the returned product. I've been following this guide to try to create a signed URL for the images, but I am having trouble finding a way to create the RSA-SHA256 signature of the "string-to-sign" in Objective-C. I am assuming I need to sign it using the private key of my service account. Once I get that working, I should have a fully functional product search app. If you have any insights on how to do this, I'm all ears!
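
The direction I'm looking at is roughly this (untested sketch, assuming privateKey is a SecKeyRef I have already loaded from my service account's private key material):

#import <Security/Security.h>

// Untested sketch: RSA-SHA256 sign the string-to-sign with the service
// account's private key, then base64-encode the signature.
NSData *dataToSign = [stringToSign dataUsingEncoding:NSUTF8StringEncoding];
CFErrorRef signError = NULL;
NSData *signature = (NSData *)CFBridgingRelease(SecKeyCreateSignature(
    privateKey,
    kSecKeyAlgorithmRSASignatureMessagePKCS1v15SHA256,
    (__bridge CFDataRef)dataToSign,
    &signError));
NSString *encodedSignature = [signature base64EncodedStringWithOptions:0];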

miworking commented 5 years ago

I see, it is good to know that you've already successfully made it to the next step.

This is probably not the best thread to address that issue, as it is about accessing the image URLs returned by the Vision API, which is out of the scope of this project. As for this project, we were mainly trying to provide an example of Material Design for different machine learning patterns.

But we do have a plan to write a blog post in August about the integration between the Vision API and ML Kit object detection & tracking, providing all the details and example code from beginning to end. For now, though, we don't have the bandwidth for it, sorry about that.

About the authentication issue from Objective-C, I would suggest you get support through this link; they will raise a ticket and get a dedicated expert to help you on the case. Hope you get the best help from there!

m-atlantis commented 5 years ago

Sounds great, thanks for the help!