long2506 / google-api-php-client

Automatically exported from code.google.com/p/google-api-php-client
Apache License 2.0

Support FusionTables importRows functionality #192

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. authenticate using oauth 2 and a service account
2. connect to the fusiontables service
3. create an sql statement to insert into an existing table where the sql is > 
2k.  This can be a single insert or multiple inserts chained with a ;
4. run the sql through fusionTables->query->sql
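For reference, the reproduction steps above might look something like this with the 0.6-era client library (the service-account email, key path, table id, and column names are placeholders, not taken from the original report):

```php
<?php
// Sketch of the reproduction steps, 0.6-era client library.
// The service-account email, key path, and table id are placeholders.
require_once 'google-api-php-client/src/Google_Client.php';
require_once 'google-api-php-client/src/contrib/Google_FusiontablesService.php';

$client = new Google_Client();
$client->setAssertionCredentials(new Google_AssertionCredentials(
  'service-account@example.iam.gserviceaccount.com',
  array('https://www.googleapis.com/auth/fusiontables'),
  file_get_contents('/path/to/privatekey.p12')
));

$service = new Google_FusiontablesService($client);

// Chain enough INSERTs to push the SQL past ~2 KB.
$sql = '';
for ($i = 0; $i < 100; $i++) {
  $sql .= "INSERT INTO 1abcDEF (name, value) VALUES ('row$i', $i);";
}

// Fails with HTTP 414 because the SQL ends up in the query string.
$result = $service->query->sql($sql);
```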

What is the expected output? What do you see instead?

If the sql is valid and meets the API limitations (<500 inserts per request, < 
1MB data, <10000 cells) then the data should be inserted into the table.  
Instead, there is an exception from the server along these lines: 

414. That's an error.
The requested URL /fusiontables/v1/query... is too large to process. That's all we know.

Hacking the code to provide an alternate implementation of sql() that puts the 
request in the POST body generates a server error saying the required 
parameter sql is missing.

The documentation of the service API seems to indicate that INSERT should use 
POST and place the sql statements in the post body, but it seems this is 
unsupported by both the google-api-php-client and the fusion tables v1 api at 
the moment.

What version of the product are you using? On what operating system?

Please provide any additional information below.

Original issue reported on code.google.com by pagameba on 11 Sep 2012 at 7:13

GoogleCodeExporter commented 9 years ago
I've worked more on this and managed to figure out what is going on. The 
Fusion Tables query API allows you to pass sql= in the POST body, which 
avoids the URI length limit, but the code in Google_ServiceResource always uses a 
content-type of application/json for POST when postBody is set, while the query 
API only works with application/x-www-form-urlencoded as the 
content-type.

I'm not sure where the best place to fix this is.  I've made a couple of hacks 
to work around the problem:

1. added an sqlPost method to the QueryService definition (definitely not 
discoverable) which removes the sql parameter as a required parameter

2. the sqlPost implementation in the service sets the post body to sql=$sql

3. the Google_ServiceResource __call method now checks to see if the first 
character in the postBody is { or [ and sets the content type to 
application/json if it is, otherwise application/x-www-form-urlencoded
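The heuristic in step 3 can be sketched as a standalone function (the name guessContentType is ours, for illustration):

```php
<?php
// Sketch of the heuristic described in step 3: pick the content type
// based on the first character of the post body. A JSON payload starts
// with '{' or '['; anything else is treated as a form-encoded body.
function guessContentType($postBody) {
    $first = substr(ltrim($postBody), 0, 1);
    if ($first === '{' || $first === '[') {
        return 'application/json; charset=UTF-8';
    }
    return 'application/x-www-form-urlencoded; charset=UTF-8';
}

echo guessContentType('{"sql": "SELECT 1"}'), "\n"; // application/json; charset=UTF-8
echo guessContentType('sql=SELECT+1'), "\n";        // application/x-www-form-urlencoded; charset=UTF-8
```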

Hopefully this makes sense, please let me know if you want more details on my 
work-around.

Original comment by pagameba on 12 Sep 2012 at 3:02

GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
Pagameba, 

It seems I encountered the same issue: the API still uses GET, while it 
should use POST. 

Can you provide your changed files so I can check them out as well? I had some 
difficulties removing the 'sql' parameter as a required parameter. You changed 
that at Google_FusiontablesService line 604?

Any support from Google might be useful as well, as filling the Fusion Tables 
does not really work like this ;)

Cheers

Original comment by menno.ho...@gmail.com on 15 Oct 2012 at 3:26

GoogleCodeExporter commented 9 years ago
OK, my import is finally running. Below are the changes I made to have the HTTP POST 
work properly for individual and multiple query statements (tested up to 700 KB).

1. Replace the query property in the constructor of Google_FusiontablesService 
(line 603) with the following (removes the sql parameter from the request and response):

$this->query = new Google_QueryServiceResource($this, $this->serviceName, 'query', json_decode('{
    "methods": {
        "sqlPost": {
            "scopes": [
                "https://www.googleapis.com/auth/fusiontables",
                "https://www.googleapis.com/auth/fusiontables.readonly"
            ],
            "parameters": {
                "typed": {"type": "boolean", "location": "query"},
                "hdrs": {"type": "boolean", "location": "query"}
            },
            "id": "fusiontables.query.sqlPost",
            "httpMethod": "POST",
            "path": "query",
            "response": {"$ref": "Sqlresponse"}
        },
        "sql": {
            "scopes": [
                "https://www.googleapis.com/auth/fusiontables",
                "https://www.googleapis.com/auth/fusiontables.readonly"
            ],
            "parameters": {
                "typed": {"type": "boolean", "location": "query"},
                "hdrs": {"type": "boolean", "location": "query"}
            },
            "id": "fusiontables.query.sql",
            "httpMethod": "POST",
            "path": "query",
            "response": {"$ref": "Sqlresponse"}
        }
    }
}', true));

2. Add an sqlPost function, which carries the sql in the postBody:
public function sqlPost($sql, $optParams = array()) {
    $params = array('postBody' => "sql=$sql");
    $params = array_merge($params, $optParams);
    $data = $this->__call('sqlPost', array($params));
    if ($this->useObjects()) {
        return new Google_Sqlresponse($data);
    } else {
        return $data;
    }
}

3. In Google_ServiceResource, line 161: in case no JSON request is detected, we 
can safely set the application/x-www-form-urlencoded content-type:

// If no JSON markers are found, assume a form-encoded POST body
if (strpos($postBody, '{') === false && strpos($postBody, '[') === false) {
    $contentTypeHeader['content-type'] = 'application/x-www-form-urlencoded; charset=UTF-8';
} else {
    $contentTypeHeader['content-type'] = 'application/json; charset=UTF-8';
}

Original comment by menno.ho...@gmail.com on 15 Oct 2012 at 4:09

GoogleCodeExporter commented 9 years ago
Hi, sorry for the delay in responding - looks like you figured it out :) Your 
changes look similar to mine.

Original comment by pagameba on 18 Oct 2012 at 7:16

GoogleCodeExporter commented 9 years ago
Hi,

It would be wonderful if the google-api-php-client could handle the 
throttling/size limitations related to Fusion Tables Service in a transparent 
way.
i.e: adding the row INSERT statements in the first step, then sending the 
request(s) - depending on POST body size and number of inserts - in a later 
step. 
IMHO there's no point in using GET uri for INSERT statements as it introduces 
another limitation to address, I'd always use the POST method.
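The batching idea could be sketched like this (the function name and exact limits are illustrative, based on the documented caps of 500 statements and ~1 MB per request):

```php
<?php
// Illustrative sketch of client-side batching under the documented limits
// (at most 500 INSERTs and roughly 1 MB of SQL per request). The function
// name and limit values are ours, not part of the library.
function chunkInserts(array $statements, $maxCount = 500, $maxBytes = 1000000) {
    $batches = array();
    $current = array();
    $size = 0;
    foreach ($statements as $stmt) {
        $len = strlen($stmt) + 1; // room for the ';' separator
        if ($current && (count($current) >= $maxCount || $size + $len > $maxBytes)) {
            $batches[] = implode(';', $current);
            $current = array();
            $size = 0;
        }
        $current[] = $stmt;
        $size += $len;
    }
    if ($current) {
        $batches[] = implode(';', $current);
    }
    return $batches; // each element is one POST-able sql= payload
}
```

Each returned batch could then be sent as a separate POST body, keeping every request under the quota limits.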

Thanks,
//S 

Original comment by racz.san...@gmail.com on 4 Dec 2012 at 2:31

GoogleCodeExporter commented 9 years ago
In case someone is looking for a way to insert more than 500 records in one step: 
importing a CSV is the way to go. For limitations and more details, see: 
https://developers.google.com/fusiontables/docs/v1/reference/table/importRows

Since this is not supported by google-api-php-client as of today, I made some 
modifications to leverage the power of this newly added feature. All 
modifications are valid for the 0.6.0 version.

Add new service classes to src/contrib/Google_FusiontablesService.php:

class Google_FusiontablesUploadService extends Google_Service {
  public $import;
  public function __construct(Google_Client $client) {
    $this->servicePath = 'upload/fusiontables/v1/';
    $this->version = 'v1';
    $this->serviceName = 'fusiontables';

    $client->addService($this->serviceName, $this->version);
    $this->import = new Google_TableUploadServiceResource($this, $this->serviceName, 'table', json_decode('{
        "methods":{
           "import":{
              "scopes":[
                 "https://www.googleapis.com/auth/fusiontables"
              ],
              "path":"tables/{tableId}/import",
              "id":"fusiontables.table.import",
              "parameters":{
                 "tableId":{
                    "required":true,
                    "type":"string",
                    "location":"path"
                 }
              },
              "response":{
                 "$ref":"TableImportResponse"
              },
              "httpMethod":"POST",
              "requestContentTypeOverride":"application/octet-stream"
           }
        }
     }', true));
  }
}
class Google_TableUploadServiceResource extends Google_ServiceResource {
  /**
   * Import data into a table. (table.import)
   *
   * @param string $tableId
   * @param string $csv
   * @param array $optParams Optional parameters.
   * @return Google_TableImportResult
   */
  public function import($tableId, $csv, $optParams = array()) {
    $params = array('tableId' => $tableId, 'postBody' => $csv);
    $params = array_merge($params, $optParams);
    $data = $this->__call('import', array($params));
    if ($this->useObjects()) {
      return new Google_TableImportResult($data);
    } else {
      return $data;
    }
  }
}
class Google_TableImportResult extends Google_Model {
  public $kind;
  public $numRowsReceived;
  public function getKind() {
      return $this->kind;
  }

  public function setKind($kind) {
      $this->kind = $kind;
  }

  public function getNumRowsReceived() {
      return $this->numRowsReceived;
  }

  public function setNumRowsReceived($numRowsReceived) {
      $this->numRowsReceived = $numRowsReceived;
  }
}

Replace line 142 in src/service/Google_ServiceResource.php with the following to 
handle an "application/octet-stream" Content-Type on POST:
    $contentType = isset($method['requestContentTypeOverride']) ? $method['requestContentTypeOverride'] : false;

Usage:
        $uploadService = new Google_FusiontablesUploadService($client);
        $uploadServiceResource = $uploadService->import;
        /* @var $uploadServiceResource Google_TableUploadServiceResource */
        $response = $uploadServiceResource->import($tableId, $csv);

$csv : string containing your actual csv content. This will be placed in the 
POST body. 
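In case it helps, here is one way the $csv string might be built from PHP arrays, using a memory stream so fputcsv handles quoting and escaping (this helper is our own, not part of the library):

```php
<?php
// Build a CSV string from row arrays via a temp stream, letting
// fputcsv take care of quoting and escaping each field.
function rowsToCsv(array $rows) {
    $fh = fopen('php://temp', 'r+');
    foreach ($rows as $row) {
        fputcsv($fh, $row, ',', '"', '\\');
    }
    rewind($fh);
    $csv = stream_get_contents($fh);
    fclose($fh);
    return $csv;
}

// Column count must match the target table (see the importRows docs).
$csv = rowsToCsv(array(
    array('Tokyo', 35.68, 139.69),
    array('Delhi', 28.61, 77.21),
));
```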

I hope it helps someone.

Cheers,
//S

Original comment by racz.san...@gmail.com on 5 Dec 2012 at 2:09

GoogleCodeExporter commented 9 years ago
importRows with Fusion Tables works great, thanks!

Original comment by jfr.bouc...@gmail.com on 25 Jan 2013 at 2:19

GoogleCodeExporter commented 9 years ago
menno and pagameba,
Thanks for your work on this. I ran into the same problem (not knowing what 
the real issue was) and broke my queries down to one at a time - very slow. This 
weekend I needed a much larger query (over 1260 characters before any 
encoding) and hit the problem again. I had posted earlier and the only 
response indicated I needed to give more info.

The client library files I have are dated 10-8. My first attempt had a DELETE 
which worked, but my long INSERT failed with an error:
====
Fatal error: Uncaught exception 'Google_ServiceException' with message 'Error calling POST https://www.googleapis.com/fusiontables/v1/query: (403) Forbidden' in /var/www/htalias-cs/FUSION/google-api-php-client/src/io/Google_REST.php:66
Stack trace:
#0 /var/www/htalias-cs/FUSION/google-api-php-client/src/io/Google_REST.php(36): Google_REST::decodeHttpResponse(Object(Google_HttpRequest))
#1 /var/www/htalias-cs/FUSION/google-api-php-client/src/service/Google_ServiceResource.php(183): Google_REST::execute(Object(Google_HttpRequest))
#2 /var/www/htalias-cs/FUSION/google-api-php-client/src/contrib/Google_FusiontablesService.php(173): Google_ServiceResource->__call('sqlPost', Array)
#3 /var/www/htalias-cs/FUSION/UpdateFusion.php(103): Google_QueryServiceResource->sqlPost('INSERT INTO 1uk...')
#4 {main}
  thrown in /var/www/htalias-cs/FUSION/google-api-php-client/src/io/Google_REST.php on line 66
====
I'll have to do some more digging.  If either of you have seen this before and 
know how to fix it let me know.

It really seems that the library should be updated to deal with this issue 
(unless I have an outdated library).

Thanks again for your analysis of the problem.
johnh...

Original comment by john.hay...@my.wheaton.edu on 18 Feb 2013 at 8:31

GoogleCodeExporter commented 9 years ago
Just a short follow-up.
1) I had created a new table with a different id, so that's why I got Forbidden 
(it would be nice to have a "no such table id" message).
2) Using the sqlPost method, for some reason I can't seem to get UTF-8 
(Japanese) characters to show up - it may well be related to how I am 
processing the data.

Original comment by john.hay...@my.wheaton.edu on 20 Feb 2013 at 2:43

GoogleCodeExporter commented 9 years ago
Another follow-up.
I'm still having problems getting UTF-8 characters to show up via sqlPost.

I made some changes to the mysqli environment I was getting some of the data 
from, and modified a header to allow PHP pages to display the data being 
sent - the SQL with the Japanese characters shows up fine when displayed 
on the web page, but only the ASCII characters (not the UTF-8 Japanese 
characters) show up in the table.

I'm pretty sure the SQL did result in UTF-8 Japanese characters before the 
patch was put in - one difference is that this table was built using the API 
whereas the other table was built via the GUI.

johnh...

Original comment by john.c.h...@gtempaccount.com on 24 Feb 2013 at 10:49

GoogleCodeExporter commented 9 years ago
John, 2 things:

1: I don't know if the query API supports UTF-8 encoding (it should..), but 
the importRows method does. Moreover, the encoding field in the request defaults 
to UTF-8. More on the subject: 
https://developers.google.com/fusiontables/docs/v1/reference/table/importRows . 
If you really want to stick with the query API, debug your request and check the 
Content-Type header. It should be Content-Type: text/html; charset=utf-8

2: Conversations like this should take place over at stackoverflow.com. You'd 
get a much larger audience and you wouldn't spam the feature tracker. 

Original comment by racz.san...@gmail.com on 25 Feb 2013 at 9:14

GoogleCodeExporter commented 9 years ago
Updated title to reflect what seems to be the major ask. 

Original comment by ianbar...@google.com on 22 Mar 2013 at 3:38

GoogleCodeExporter commented 9 years ago
Updated FusionTables service in trunk:
https://code.google.com/p/google-api-php-client/source/detail?r=545

It would be great if someone can verify that it's working now.

Original comment by silv...@google.com on 28 Mar 2013 at 11:49

GoogleCodeExporter commented 9 years ago
I wonder if this is really fixed, as I can't seem to find out how to import CSV 
using the latest version (i.e. how do you feed CSV to the API?) unless we 
implement the changes proposed above by racz ourselves.

Similarly, inserting multiple rows (> 50) doesn't seem to work either (I'm 
getting a Google_ServiceException unless I use a small number of rows).

Either way, it would be great to have this feature fixed, as in its current 
state the library makes it very hard to stay within the API quota limits.

Original comment by David.Ca...@gmail.com on 30 May 2013 at 2:08

GoogleCodeExporter commented 9 years ago
re-opening

Original comment by silv...@google.com on 30 May 2013 at 3:03

GoogleCodeExporter commented 9 years ago
The patch on 
http://code.google.com/p/google-api-php-client/issues/detail?id=192#hc4 works, 
but the svn source still fails.

Original comment by federwe...@gmail.com on 20 Jun 2013 at 3:32

GoogleCodeExporter commented 9 years ago
Good to know. I opted to implement racz's patch as explained in #7. Even though 
it initially worked well, today it's been a complete failure, always getting a 
503 "Backend error" for every attempt (I don't blame the patch, but some 
possible server issue today, Sunday).

BTW, is it correct to assume there are some issues using "import" on an existing 
non-empty table? Should it be used on brand-new tables only? Or can you 
otherwise append data?

I got back to using single INSERTs, but of course this is far from ideal, 
taking into account that some of the data I need to pass contains dozens or 
hundreds of rows (very slow and also consumes quota easily)

In the past, I also tried using several INSERTs in a row, but this doesn't work 
either, as it seems the PHP library passes this through a GET instead of a POST.

Any further hint?

Original comment by David.Ca...@gmail.com on 30 Jun 2013 at 2:44

GoogleCodeExporter commented 9 years ago
Hello,

 I am trying to use racz's patch as explained in #7, and cannot get the CSV file to upload. I have downloaded an already-working Fusion Tables CSV file and am attempting to pass it to the function. I have tried both the local path and the web URL for the file without success.

Error:
(400) Content has a different number of columns than the table (line 1).

Has anyone had anymore luck getting CSV Files to upload?

Original comment by mumford....@gmail.com on 16 Jul 2013 at 8:51

GoogleCodeExporter commented 9 years ago
The Google Fusion Tables PHP API appears to contain "importRows" by default, 
but there is no documentation around its usage. This could probably be closed 
off if someone gave usage examples.
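A hedged sketch of what such usage might look like with the trunk (r545) service. The importRows signature and the 'data'/'mimeType' parameters follow the 0.6 media-upload convention seen in other generated services, but they are assumptions here - check the generated Google_FusiontablesService.php in your copy:

```php
<?php
// Hypothetical usage sketch - verify the generated method signature
// in src/contrib/Google_FusiontablesService.php before relying on it.
$client = new Google_Client();
// ... authenticate as usual ...
$service = new Google_FusiontablesService($client);

$tableId = '1abcDEF';                                  // placeholder
$csv = "Tokyo,35.68,139.69\nDelhi,28.61,77.21\n";

$result = $service->table->importRows($tableId, array(
  'data' => $csv,                      // assumption: raw CSV in the body
  'mimeType' => 'application/octet-stream',
));
```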

Original comment by m...@jgrah.am on 26 Jul 2013 at 1:31

GoogleCodeExporter commented 9 years ago
Can you guys please provide a sample? It's really a pain to try to figure out 
how this is supposed to work. I've tried all sorts of combinations. It looks like 
other API calls that provide a postBody of some sort take a specific typed 
instance, e.g.:

public function insert($projectId, $datasetId, Google_Table $postBody, 
$optParams = array()) {

So I'm not sure if this is relevant and whether one of the handlers picks the 
type up and parses it somehow.

I just want to pass in the CSV body and import those rows.

I get this error:
Google_ServiceException: Error calling POST 
https://www.googleapis.com/fusiontables/v1/tables/XXXX-KEjs/import?key=XXXX: 
(503) Backend Error in Google_REST::decodeHttpResponse() (line 66 of 
/Users/xxxx/google-api-php-client/src/io/Google_REST.php).

Also, what is odd: based on the API documentation, shouldn't the POST occur on a 
different URL with 'upload' included? E.g.:
POST https://www.googleapis.com/upload/fusiontables/v1/tables/tableId/import

As per Fusion Tables Docs: 
https://developers.google.com/fusiontables/docs/v1/reference/table/importRows

Original comment by jer...@jnpwebdeveloper.com on 1 Aug 2013 at 3:58

GoogleCodeExporter commented 9 years ago
Just to say, it might be worth pinging the Fusion Tables team with regards to a 
PHP sample - we don't generally implement new client samples from the client 
library perspective, and I'm afraid they're unlikely to check this issue 
tracker. 

Original comment by ianbar...@google.com on 1 Aug 2013 at 4:01

GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
The documentation on how to use the FT API with PHP is very poor. If someone has 
managed to create a working project, please share your sample with us.

Original comment by hoss...@gmail.com on 30 Aug 2013 at 4:35

GoogleCodeExporter commented 9 years ago
For anyone else looking out there, I found this code and fix very useful for 
finally getting an easy solution with the google-api-php-client:
https://groups.google.com/forum/#!msg/google-api-php-client/9d2lQAppTvg/vh8CRhWJb_YJ

Original comment by logan.t....@gmail.com on 8 Dec 2013 at 4:30

GoogleCodeExporter commented 9 years ago
This issue tracker is now closing. Development on the Google PHP client library 
moved to GitHub with the release of 1.0.0-alpha, and now that the 1.0 branch has 
reached beta status there will be no further releases of the 0.6 branch of the 
library. 

Please take a look at the latest version on 
https://github.com/google/google-api-php-client

For information on migrating, please take a look at this guide: 
https://developers.google.com/api-client-library/php/guide/migration

For general library support please ask a question on StackOverflow: 
http://stackoverflow.com/questions/tagged/google-api-php-client

If you are looking for support with a specific API, please contact the team 
working with that API via StackOverflow or their preferred support mechanism. 

If your issue still exists with the new version of the library, please raise a 
bug in the GitHub issue tracker with a minimal code sample. 

Thanks!

Original comment by ianbar...@google.com on 22 Jan 2014 at 4:53