Closed: eddowh closed this issue 8 years ago.
I was thinking: when we want to write a list of records into the DB, why don't we check the list for duplicates and remove them before committing? For example, given a list of tuples `[(datetime, uuid, key, value), ...]` whose fields correspond to `records.datetime`, `uuid`, and `key` (which are all primary keys when we define the `Record` schema in SQLAlchemy), we could drop the duplicates and then build the list of `Record` objects `[Record<ts, uuid, key, value>, ...]`.
The list of `Record` objects can then be added and committed to the database in one step:

```python
session.add_all(list_of_Record_objects)
session.commit()
```
This way, we don't have to atomize the list of records when trying to commit them. It would be faster, and I suppose the code would also be more readable.
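A rough sketch of what I have in mind (assuming SQLAlchemy 1.4+; the `Record` column names and types below are placeholders, not the project's actual schema):

```python
from collections import OrderedDict

from sqlalchemy import Column, DateTime, Float, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Record(Base):
    # Hypothetical model; the issue only assumes that
    # (datetime, uuid, key) form the composite primary key.
    __tablename__ = "records"
    datetime = Column(DateTime, primary_key=True)
    uuid = Column(String, primary_key=True)
    key = Column(String, primary_key=True)
    value = Column(Float)


def dedupe_and_commit(session, rows):
    """rows: iterable of (datetime, uuid, key, value) tuples."""
    # Keep only the last occurrence of each (datetime, uuid, key) primary key.
    unique = OrderedDict()
    for ts, uuid, key, value in rows:
        unique[(ts, uuid, key)] = value

    records = [
        Record(datetime=ts, uuid=uuid, key=key, value=value)
        for (ts, uuid, key), value in unique.items()
    ]
    session.add_all(records)
    session.commit()
```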
You can't know that all of the duplicates are within the data about to be pushed (some rows may already exist in the database), so let the DB detect them and handle the situation from there.
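Roughly something like this instead (a sketch only; the function name and the skip-on-duplicate policy are assumptions, not the current implementation):

```python
from sqlalchemy.exc import IntegrityError


def commit_records(session, records):
    """Try a bulk commit; on a primary-key collision, fall back to
    inserting rows one by one and skipping the duplicates."""
    try:
        session.add_all(records)
        session.commit()
    except IntegrityError:
        session.rollback()
        for record in records:
            try:
                session.add(record)
                session.commit()
            except IntegrityError:
                # Row already exists (in this batch or in the table); skip it.
                session.rollback()
```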
The method's implementation is outdated because it is inconsistent with the input data format. See #30.
Closing.
Highly recommended to bump the version after merging.