RudySantana / google-api-python-client

Automatically exported from code.google.com/p/google-api-python-client

Check Response is below memcache limit for App Engine #224

Closed GoogleCodeExporter closed 8 years ago

GoogleCodeExporter commented 8 years ago
I submitted a large job to BigQuery that had an error in it. When I checked the 
job's status, BigQuery returned an error along with the original job payload I 
had sent over. The payload was about 3MB, and upon receiving the response, the 
API client tried to cache it. I believe a check should be put in place for the 
memcache value-size limit in GAE:

  File "/base/data/home/apps/s~<app-name-removed>/<backend-name-removed>.363613723921559936/oauth2client/util.py", line 120, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/base/data/home/apps/s~<app-name-removed>/<backend-name-removed>.363613723921559936/apiclient/http.py", line 676, in execute
    headers=self.headers)
  File "/base/data/home/apps/s~<app-name-removed>/<backend-name-removed>.363613723921559936/oauth2client/util.py", line 120, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/base/data/home/apps/s~<app-name-removed>/<backend-name-removed>.363613723921559936/oauth2client/client.py", line 420, in new_request
    redirections, connection_type)
  File "/base/data/home/apps/s~<app-name-removed>/<backend-name-removed>.363613723921559936/httplib2/__init__.py", line 1588, in request
    (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey)
  File "/base/data/home/apps/s~<app-name-removed>/<backend-name-removed>.363613723921559936/httplib2/__init__.py", line 1394, in _request
    _updateCache(headers, response, content, self.cache, cachekey)
  File "/base/data/home/apps/s~<app-name-removed>/<backend-name-removed>.363613723921559936/httplib2/__init__.py", line 438, in _updateCache
    cache.set(cachekey, text)
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/memcache/__init__.py", line 793, in set
    namespace=namespace)
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/memcache/__init__.py", line 898, in _set_with_policy
    time, '', namespace)
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/memcache/__init__.py", line 977, in _set_multi_async_with_policy
    stored_value, flags = _validate_encode_value(value, self._do_pickle)
  File "/base/python27_runtime/python27_lib/versions/1/google/appengine/api/memcache/__init__.py", line 236, in _validate_encode_value
    'received %d bytes' % (MAX_VALUE_SIZE, len(stored_value)))
ValueError: Values may not be more than 1000000 bytes in length; received 
3097357 bytes
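The check being asked for could be sketched like this (illustrative only, not actual library code; the 1,000,000-byte limit matches the memcache MAX_VALUE_SIZE from the ValueError above, and `cache_set_if_small` is a hypothetical helper name):

```python
# Illustrative guard: refuse to cache any value that would exceed the
# App Engine memcache per-value limit, rather than letting set() raise
# the ValueError shown in the traceback.
MEMCACHE_MAX_VALUE_SIZE = 1000000  # bytes, per the ValueError above


def cache_set_if_small(cache, key, value):
    """Store value only if it fits under the memcache limit.

    Returns True if stored, False if the value was too large.
    """
    if len(value) >= MEMCACHE_MAX_VALUE_SIZE:
        return False  # too large for memcache; skip caching entirely
    cache[key] = value  # a dict here; memcache.set(...) on App Engine
    return True
```

The same test would sit just before the `cache.set(cachekey, text)` call in `_updateCache`, so an oversized response is simply not cached instead of aborting the request.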

Original issue reported on code.google.com by someone1@gmail.com on 5 Dec 2012 at 7:49

GoogleCodeExporter commented 8 years ago
Suggested a patch for httplib2 that would solve this issue:
http://code.google.com/p/httplib2/issues/detail?id=238

Original comment by someone1@gmail.com on 5 Dec 2012 at 8:13

GoogleCodeExporter commented 8 years ago
From that issue on httplib2, the suggested fix is to screen values by size 
using a simple wrapper around memcache:

from google.appengine.api import memcache

class SizeLimitedMemCache(object):
  def set(self, key, value, time=0, min_compress_len=0, namespace=None):
    if len(value) < 1000000:
      return memcache.set(key, value, time, min_compress_len, namespace)
    else:
      return False

  def get(self, key, namespace=None, for_cas=False):
    return memcache.get(key, namespace, for_cas)

  def delete(self, key, seconds=0, namespace=None):
    return memcache.delete(key, seconds, namespace)
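httplib2 only requires its cache object to expose get, set, and delete, so a wrapper like the one above can be dropped in via httplib2.Http(cache=SizeLimitedMemCache()). A self-contained sketch of that duck-typed contract, using a plain dict in place of App Engine memcache (class and attribute names here are illustrative):

```python
# Sketch: httplib2 duck-types its cache as get/set/delete, so a
# size-limiting wrapper over any backend satisfies the contract.
class SizeLimitedCache(object):
  """Silently drops values at or over max_bytes instead of storing them."""

  def __init__(self, backend, max_bytes=1000000):
    self._backend = backend      # a dict here; memcache on App Engine
    self._max_bytes = max_bytes

  def set(self, key, value):
    if len(value) < self._max_bytes:
      self._backend[key] = value

  def get(self, key):
    return self._backend.get(key)

  def delete(self, key):
    self._backend.pop(key, None)
```

Because oversized values are dropped rather than rejected with an exception, httplib2 keeps working normally and simply re-fetches those responses instead of serving them from cache.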

Original comment by jcgregorio@google.com on 7 Dec 2012 at 7:27

GoogleCodeExporter commented 8 years ago
Setting maxResults on getQueryResults also avoids the issue.
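A sketch of that workaround (the helper and its paging loop are illustrative; the parameter names projectId, jobId, maxResults, and pageToken come from the BigQuery v2 jobs.getQueryResults method):

```python
# Sketch of paging BigQuery results with maxResults, so that no single
# response grows large enough to hit the memcache value limit. `jobs`
# stands in for the API client's jobs() resource.
def iter_query_rows(jobs, project_id, job_id, page_size=500):
  """Yield result rows one page at a time."""
  page_token = None
  while True:
    response = jobs.getQueryResults(
        projectId=project_id, jobId=job_id,
        maxResults=page_size, pageToken=page_token).execute()
    for row in response.get('rows', []):
      yield row
    page_token = response.get('pageToken')
    if not page_token:
      break  # no more pages
```

This sidesteps the cache problem rather than fixing it: each HTTP response stays small, so httplib2's cache never sees a multi-megabyte value.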

Original comment by caioigle...@gmail.com on 6 Jun 2014 at 8:33