Closed jplock closed 11 months ago
@jplock Hey, happy to support that. Seems like the implementation would be similar to the batch writing that's already there.
I wonder if automatically flushing at 100 items might cause issues with expected behavior, since the operations would be broken up into multiple transactions? One solution is an `auto_flush: bool = False` option in the context manager function.
For your use case, would you need this to support multiple tables in the same transaction?
auto flush makes sense. For my use case I’m using transactions to write objects to two tables (one for data and one acting as an index)
Actually, single table should be fine to start for my use case. I forgot I’m using the dynamo stream to populate the other table.
I'll try to get an alpha version out sometime this week to get your feedback.
Another nuance with how I’m currently using Transactions is to create a transaction writer at the beginning of a request then add various put, update, delete and conditions that all get committed at once. So basically a way to begin a transaction in a context manager outside of one of the models (I’m using a single table design).
@jplock Give `0.11.0a1` a go and let me know your feedback. Sample usage:
```python
from dyntastic import transaction

with transaction():
    item1 = SomeTable(...)
    item2 = AnotherTable.get(...)
    item1.save()
    item2.update(A.something.set("..."))
```
You can use `transaction(auto_commit=True)` to automatically submit the transaction and begin a new one once it reaches 100 items.
There isn't any handling for failed transactions, e.g. conditions failing for saves/updates/deletes, or `item.transaction_condition(...)`; boto3 will just raise a `TransactionCanceledException`. Let me know if you'd want a way to ignore certain exceptions like that (might be tricky, since lots of things can cause that particular exception).
It also currently does not commit changes when an exception is raised upon exiting the `transaction()` context manager; let me know if it'd be helpful to have an option to disable that safety feature.
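For anyone curious how the auto-commit flush and the no-commit-on-exception safety could fit together, here's a rough stdlib-only sketch. This is not dyntastic's actual implementation; the `commit_fn` callback and the `Transaction` class are stand-ins for illustration.

```python
from contextlib import contextmanager

MAX_TRANSACT_ITEMS = 100  # DynamoDB's TransactWriteItems cap


class Transaction:
    def __init__(self, commit_fn, auto_commit=False):
        self.commit_fn = commit_fn  # e.g. would wrap a transact_write_items call
        self.auto_commit = auto_commit
        self.items = []

    def add(self, item):
        self.items.append(item)
        # With auto_commit, flush and start a fresh transaction at the cap
        if self.auto_commit and len(self.items) >= MAX_TRANSACT_ITEMS:
            self.commit()

    def commit(self):
        if self.items:
            self.commit_fn(self.items)
            self.items = []


@contextmanager
def transaction(commit_fn, auto_commit=False):
    tx = Transaction(commit_fn, auto_commit)
    try:
        yield tx
    except BaseException:
        # Safety feature: discard pending items instead of committing on error
        tx.items.clear()
        raise
    tx.commit()
```

For example, adding 250 items with `auto_commit=True` would produce three commits (100, 100, and 50 items), while raising inside the `with` block commits nothing.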
Awesome, thank you, will test this out
This is a wonderful addition; it is actually the only item stopping us from migrating to dyntastic. @nayaverdier, is there any roadmap or timeline to release 0.11 as a non-alpha release? :pray:
@photonbit From my testing this should be ready for release. I'm not sure if @jplock had a chance to try it out yet, have you checked that it works for your use cases? If so I can go ahead and just release 0.11.
I'm trying to incorporate this into my project. I'm using field aliases on my pydantic models and then doing `default_version.model_dump(mode="json", by_alias=True, exclude_none=True)` to get an Item to pass to DynamoDB (after running it through the serializer). I'll have to figure out a similar approach with this framework.
As far as I know that should work right away, as the `Dyntastic` class inherits from `pydantic.BaseModel`.
@jplock The current behavior when saving a model is to dump it with `by_alias=True`, and null fields are always removed given DynamoDB's limitations, so that should work out of the box with `.save()`. What's the reason you're dumping with `mode="json"`?
Edit: Are you using pydantic v2? If so, that is not yet supported by dyntastic, but it is on my roadmap.
Yes, I'm using pydantic v2. I'm fairly new to pydantic, so if this isn't necessary I'd love to hear it, but I've got two `datetime` fields in my model that need to be converted to ISO 8601 strings when dumped rather than left as `datetime` objects, since DynamoDB doesn't know how to store those. Using `mode="json"` fixed the issue for me.
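The underlying conversion here is just standard ISO 8601 formatting; in plain stdlib Python (no pydantic involved) it looks like this:

```python
import json
from datetime import datetime, timezone

created_at = datetime(2023, 5, 17, 12, 30, tzinfo=timezone.utc)

# DynamoDB has no native datetime type, so the value is stored as an
# ISO 8601 string instead
iso_string = created_at.isoformat()
print(iso_string)  # 2023-05-17T12:30:00+00:00

# json.dumps fails on raw datetime objects unless you convert them first,
# which is effectively what a mode="json" dump does for you
item = {"id": "abc", "created_at": created_at.isoformat()}
print(json.dumps(item))
```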
@jplock What is your dyntastic usage with pydantic v2? They don't play nicely together yet, so I'm curious to see how you're using it.
The way Dyntastic handles that case is similar, by doing a JSON dump internally, but if you're using Dyntastic you shouldn't have to think about that.
@nayaverdier I have a question about `auto_flush` and the 100 items limit. Is that number tied to some limit in DynamoDB? If it is not, maybe it would be better to make it a parameter? If that is okay, I can open a PR with this and some docs about the transaction mechanism.
@photonbit Yep, it is tied to a DynamoDB limitation, see here:

> TransactWriteItems is a synchronous and idempotent write operation that groups up to 100 write actions in a single all-or-nothing operation.
Still, it doesn't hurt to make it a parameter defaulting to 100; if AWS ever increases that limit, anybody on older dyntastic versions can override it. Documentation for `transaction` would be very appreciated as well.
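Making the limit a parameter mostly means threading it through wherever the writer chunks its pending items. A hypothetical standalone helper (the name and signature are illustrative, not dyntastic's actual API):

```python
def chunked(items, max_items=100):
    """Split pending transaction items into DynamoDB-sized batches.

    100 is DynamoDB's current TransactWriteItems cap; taking it as a
    parameter lets callers override it if AWS ever raises the limit.
    """
    for start in range(0, len(items), max_items):
        yield items[start:start + max_items]


batches = list(chunked(list(range(250))))
print([len(b) for b in batches])  # [100, 100, 50]
```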
Feel free to release a new release whenever with this support. Unfortunately I won't be able to properly evaluate the transaction support until pydantic v2 is supported. Would you like an issue created for that?
Sure, go ahead and create an issue for that to track. Thanks!
https://github.com/nayaverdier/dyntastic/pull/11 is also related to this issue.
Released in 0.11.0.
Thank you!
First I wanted to say thank you for creating this library. I've been looking for a way to represent DynamoDB "models" using something like Pydantic but didn't want a full ORM like Pynamo. We're having a discussion about it on https://github.com/aws-powertools/powertools-lambda-python/issues/2053.
Any thoughts on supporting `TransactWriteItems` and `TransactGetItems`? It would be cool to use a context manager for writing a transaction that automatically flushes it once it reaches 100 items.
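For reference, the raw boto3 call such a context manager would wrap takes a request shaped roughly like the following. The table and attribute names below are made up for illustration; this only builds the request dict and makes no AWS call.

```python
# Shape of a boto3 DynamoDB transact_write_items request: up to 100
# actions that succeed or fail together. Names here are examples only.
request = {
    "TransactItems": [
        {
            "Put": {
                "TableName": "my-data-table",
                "Item": {"pk": {"S": "user#123"}, "name": {"S": "Alice"}},
            }
        },
        {
            "Update": {
                "TableName": "my-index-table",
                "Key": {"pk": {"S": "name#Alice"}},
                "UpdateExpression": "SET user_id = :u",
                "ExpressionAttributeValues": {":u": {"S": "user#123"}},
            }
        },
    ]
}

# With a real client this would be: client.transact_write_items(**request)
print(len(request["TransactItems"]))  # 2
```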