terricain / aioboto3

Wrapper to use boto3 resources with the aiobotocore async backend
Apache License 2.0

'coroutine' object has no attribute 'batch_writer' #206

Closed marcioemiranda closed 3 years ago

marcioemiranda commented 4 years ago

Description

I've been using aioboto3 for some time (release 7.0.0). Yesterday I tried the latest release and ran into the problem described below:

'coroutine' object has no attribute 'batch_writer'

I use a batch writer to load data from S3 into DynamoDB. Going back to version 7.0.0 solved the problem.

What I Did

Snippets of the code:

run(main(event, context))

async def main(event, context):
    ...
    async with aioboto3_resource('dynamodb') as dynamo_resource:
        table = dynamo_resource.Table(environ['TABLE_NAME'])
        await s3_to_dynamo(event, context, shard_index, s3obj, table)
    ...

async def s3_to_dynamo(event, context, shard_index, s3obj, table):
    ...
    await gather(*(dynamoBatchWrite(batch, table, segmentId) for batch in limitBatchArr))
    ...

async def dynamoBatchWrite(batch, table, segmentId, count=0):
    async with table.batch_writer() as batch_writer:
        for item in batch:
            await batch_writer.put_item(
                Item=item
            )

Here is the stack trace of the error:

{
  "error": "AttributeError",
  "cause": {
    "errorMessage": "'coroutine' object has no attribute 'batch_writer'",
    "errorType": "AttributeError",
    "stackTrace": [
      "  File \"/opt/python/aws_lambda_powertools/tracing/tracer.py\", line 266, in decorate\n    response = lambda_handler(event, context)\n",
      "  File \"/opt/python/aws_lambda_powertools/logging/logger.py\", line 442, in decorate\n    return lambda_handler(event, context)\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 231, in lambda_handler\n    run(main(event, context))\n",
      "  File \"/var/lang/lib/python3.8/asyncio/runners.py\", line 43, in run\n    return loop.run_until_complete(main)\n",
      "  File \"/var/lang/lib/python3.8/asyncio/base_events.py\", line 616, in run_until_complete\n    return future.result()\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 201, in main\n    await s3_to_dynamo(event, context, shard_index, s3obj, table)\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 150, in s3_to_dynamo\n    await gather(*(dynamoBatchWrite(batch, table, segmentId) for batch in limitBatchArr))\n",
      "  File \"/var/task/loadSegmentChunk/lambda_function.py\", line 48, in dynamoBatchWrite\n    async with table.batch_writer() as batch_writer:\n"
    ]
  }
}
terricain commented 4 years ago

Since version 8 you'll need to do await resource.Table(); Table() now returns a coroutine, which is why calling batch_writer on its un-awaited result fails.
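
For reference, a minimal sketch of the adjusted call site, reusing the aioboto3_resource helper, TABLE_NAME environment variable, and batch shape from the snippet above (the function name here is just illustrative); the only behavioural change from the original code is awaiting the Table() call:

from os import environ

async def dynamo_batch_write_fixed(batch):
    async with aioboto3_resource('dynamodb') as dynamo_resource:
        # Since aioboto3 8.x, Table() returns a coroutine and must be awaited before use
        table = await dynamo_resource.Table(environ['TABLE_NAME'])
        async with table.batch_writer() as batch_writer:
            for item in batch:
                await batch_writer.put_item(Item=item)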