Closed by josejachuf 2 years ago
@josejachuf would be nice, but my question is: what happens when you're updating more than one record? Would you have multiple rows for the old values? Does that mean weppy has to select an arbitrary number of records before the update? What if the resultset corresponds to 1k records, would they all be selected in bulk?
@gi0baro It may be very costly. I wasn't looking at it like that; I was thinking of something similar to triggers.
@josejachuf Tell me more. Who/what triggers the trigger? And still, I have no idea how to distinguish between updating one record and updating more than one, because for pydal you're always updating an arbitrary number N of records: it's an atomic operation. Having the changes means it is not atomic anymore.
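To illustrate the atomicity point, here is a minimal sketch using the standard-library `sqlite3` module rather than pydal (the table and column names are made up): a bulk update is a single statement over N rows, while capturing the old values forces an extra SELECT that materializes all N rows first.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, done INTEGER)")
conn.executemany("INSERT INTO tasks (done) VALUES (?)", [(0,)] * 1000)

# Atomic bulk update: one statement, no per-row data returned.
updated = conn.execute("UPDATE tasks SET done = 1 WHERE done = 0").rowcount
print(updated)  # 1000

# Capturing "old values" means selecting every matching row first,
# which is exactly the bulk select this discussion worries about.
conn.execute("UPDATE tasks SET done = 0")
old_rows = conn.execute("SELECT id, done FROM tasks WHERE done = 0").fetchall()
print(len(old_rows))  # 1000 rows held in memory before the update runs
conn.execute("UPDATE tasks SET done = 1 WHERE done = 0")
```

For a 1k-row dbset that is 1k extra rows transferred and held in memory per update, which is the cost being weighed above.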
@gi0baro I was thinking about database triggers. But you're right, I had not seen it that way
@josejachuf Eventually I can consider creating new callbacks to split the two cases, for example `before_update_one` and `before_update_many`, where weppy makes a `count()` on the dbset before running the callbacks; but the user should be warned that it can be wrong (since a new record can be created or a record can be deleted in the meantime).

A `before_update_one` might have an option to load the changes, e.g.:

```python
@before_update_one(previous_record=True)
def my_callback(self, dbset, fields, record):
    if record.my_field != fields.my_field:
        # code
        ...
```

But this is definitely something that requires accurate analysis and design, and I definitely can't include it in 1.0 :/
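A sketch of how such a count-then-dispatch step could look. The callback names come from the comment above, but `FakeDBSet` and `run_update_callbacks` are stand-ins invented here, not weppy's API; the point is only to show where the race window sits.

```python
class FakeDBSet:
    """Stand-in for a pydal/weppy dbset; just wraps a list of row dicts."""
    def __init__(self, rows):
        self.rows = rows

    def count(self):
        return len(self.rows)

def run_update_callbacks(dbset, fields, one_cb, many_cb):
    # Count first, then dispatch, as proposed above. The count can be
    # stale by the time the actual UPDATE runs (rows inserted or deleted
    # in between), which is exactly the caveat raised in the thread.
    n = dbset.count()
    if n == 1:
        one_cb(dbset, fields)
    else:
        many_cb(dbset, fields)
    return n

calls = []
n = run_update_callbacks(
    FakeDBSet([{"id": 1}]),
    {"done": True},
    one_cb=lambda s, f: calls.append("one"),
    many_cb=lambda s, f: calls.append("many"),
)
print(n, calls)  # 1 ['one']
```

Closing the race window properly would need a transaction or row locking around both the count and the update, which is part of why the design needs careful analysis.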
This proposal may need to be done in pyDAL. The idea is to inject into the callbacks (or perhaps something else) two objects: `old` and `new`.

Example: on insert, `old` is `None`; on delete, `new` is `None`.
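A sketch of what a callback receiving the proposed `old`/`new` pair could look like. All names here are hypothetical (this is not pyDAL's actual API); plain dicts stand in for records.

```python
def on_change(old, new):
    """Hypothetical callback receiving the proposed old/new objects."""
    if old is None:
        return "insert"  # on insert, old is None
    if new is None:
        return "delete"  # on delete, new is None
    # On update both exist, so the changed fields can be diffed directly.
    changed = {k for k in new if new[k] != old.get(k)}
    return ("update", changed)

print(on_change(None, {"name": "a"}))           # 'insert'
print(on_change({"name": "a"}, None))           # 'delete'
print(on_change({"name": "a"}, {"name": "b"}))  # ('update', {'name'})
```

One appeal of this shape is that a single callback signature covers insert, update, and delete without the one-vs-many counting problem discussed above, though for bulk updates it still implies materializing the old rows.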