thedodd / wither

An ODM for MongoDB built on the official MongoDB Rust driver.
https://docs.rs/wither

wither::Migration extensions #70

Open c650 opened 3 years ago

c650 commented 3 years ago

I see that via an IntervalMigration I can set/unset fields in documents that pass a filter. Is there yet a way to transform data in a field from one format to another?

I think this may get tricky, since the changed struct would fail to deserialise documents still stored in the old format. If there's a good solution I'd be happy to work on it.
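
For reference, this is roughly what the interval-based approach covers today: $set/$unset against a filter, but no way to reshape a field's contents. This is a sketch from memory of the wither docs, so the exact field names and trait bounds may differ by version:

```rust
use chrono::prelude::*;
use wither::bson::doc;
use wither::{IntervalMigration, Migration};

// Roughly the shape of an interval migration today: it can $set / $unset
// fields on documents matching a filter, but it can't transform field data.
// Field names here are from memory of the wither docs; check your version.
fn migrations() -> Vec<Box<dyn Migration>> {
    vec![Box::new(IntervalMigration {
        name: String::from("remove-old-field"),
        // Apply this migration on connect until the threshold date passes.
        threshold: Utc.ymd(2025, 1, 1).and_hms(0, 0, 0),
        filter: doc! {"old_field": doc! {"$exists": true}},
        set: None,
        unset: Some(vec![String::from("old_field")]),
    })]
}
```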

thedodd commented 3 years ago

Hmm, we may be able to implement a StreamingMigration which, when executed, performs some static query and passes a cursor to a boxed callback declared on the migration by the user. That callback can then do whatever it likes: update documents, transform data, &c.
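
Something along these lines is what I'm imagining. Purely hypothetical, nothing named StreamingMigration exists in wither yet, and the fields are just placeholders:

```rust
use wither::bson::{doc, Document};

// Purely hypothetical sketch of the idea above.
pub struct StreamingMigration {
    /// Name used for logging / bookkeeping.
    pub name: String,
    /// Static query selecting the documents that still need migrating.
    pub filter: Document,
    /// User-supplied callback: receives each matched document as raw BSON and
    /// returns the replacement to write back, or None to leave it untouched.
    pub transform: Box<dyn Fn(Document) -> Option<Document> + Send + Sync>,
}

// How a user might declare one for their model's migrations.
fn example_migration() -> StreamingMigration {
    StreamingMigration {
        name: String::from("convert-x-items"),
        filter: doc! {"x": doc! {"$exists": true}},
        transform: Box::new(|doc| {
            // ...reshape fields on `doc` here and return the new document...
            Some(doc)
        }),
    }
}
```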

Thoughts?

c650 commented 3 years ago

that sounds cool. how can we solve the problem of our struct definitions changing? We could just do it all in BSON, but that feels messy and error-prone.

thedodd commented 3 years ago

> how can we solve the problem of our struct definitions changing?

Hmm, do you mind elaborating on your question a bit?

c650 commented 3 years ago

```rust
#[derive(Model)]
struct User {
    pub x: Vec<Inner>,
}
```

now say i want to change User to

```rust
#[derive(Model)]
struct User {
    pub x: Vec<AnotherInner>,
}
```

i need to write some code to transform this in a migration, and i also need to keep Inner around till i'm sure all the data is migrated.
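
for the window where both formats exist, keeping both types around and writing an explicit conversion between them seems like the cleanest option. a sketch, where Inner's fields are made up purely for illustration:

```rust
use serde::{Deserialize, Serialize};

// Hypothetical old and new element types, kept side by side until the
// migration has run everywhere. Inner's fields are made up for illustration.
#[derive(Debug, Serialize, Deserialize)]
struct Inner {
    pub raw: String,
}

#[derive(Debug, Serialize, Deserialize)]
struct AnotherInner {
    pub parts: Vec<String>,
}

impl From<Inner> for AnotherInner {
    fn from(old: Inner) -> Self {
        // Example transformation: split the old single string into parts.
        AnotherInner {
            parts: old.raw.split(',').map(str::to_owned).collect(),
        }
    }
}
```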

thedodd commented 3 years ago

I see. Yea, they should be able to just exist as part of the model's migrations() response data. The user can create a migration inside that method with a boxed function, and when the migration is executed it can perform the transformation on the data and write it back out to the collection.

You would only really need to keep Inner around if your migration needs to deserialize that inner data in some way; otherwise you could just deal with it as raw BSON. We could leave that to the user, though. Specifically to account for these sorts of situations, it seems logical that the StreamingMigration type (or whatever we call it) should deal with raw BSON documents instead of attempting to serialize and deserialize data for the user. Thoughts?
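
Concretely, the callback could work entirely on raw bson::Document values. A hypothetical sketch, reusing the Inner/AnotherInner shapes from above (field names are just illustrative):

```rust
use wither::bson::{bson, Bson, Document};

// Hypothetical callback body: rewrite the `x` array from the old shape to the
// new one using only raw BSON, so neither Inner nor AnotherInner has to be
// deserialized. Field names are illustrative.
fn transform(mut doc: Document) -> Option<Document> {
    let old_items = match doc.get_array("x") {
        Ok(items) => items.clone(),
        Err(_) => return None, // nothing to migrate in this document
    };
    let new_items: Vec<Bson> = old_items
        .into_iter()
        .filter_map(|item| {
            let inner = item.as_document()?;
            let raw = inner.get_str("raw").ok()?;
            // Build the new shape from the old one.
            Some(bson!({ "parts": raw.split(',').collect::<Vec<_>>() }))
        })
        .collect();
    doc.insert("x", new_items);
    Some(doc)
}
```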

c650 commented 3 years ago

> it seems logical that the StreamingMigration type (or whatever we call it) should deal with raw BSON documents instead of attempting to serialize and deserialize data for the user. Thoughts?

this sounds good. my only concern is the user having to deal with a lot of BSON boilerplate: parsing some field as some type, unwrap()ing it, and so on.
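
for example, instead of hand-parsing every field, a user could round-trip just the sub-document being migrated through serde so the manual field access stays contained. a rough sketch with made-up types:

```rust
use serde::{Deserialize, Serialize};
use wither::bson::{self, Document};

// Hypothetical old/new shapes for one element of the migrated field.
#[derive(Deserialize)]
struct OldInner {
    raw: String,
}

#[derive(Serialize)]
struct NewInner {
    parts: Vec<String>,
}

// Convert one element by round-tripping through serde, so the manual
// get_x().unwrap() field access stays out of user code.
fn convert(item: Document) -> Option<Document> {
    let old: OldInner = bson::from_document(item).ok()?;
    let new = NewInner {
        parts: old.raw.split(',').map(str::to_owned).collect(),
    };
    bson::to_document(&new).ok()
}
```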