bbengfort opened this issue 8 years ago
At this point we have a new "versioning" scheme, wherein every dataset has its own unique version that you can download, letting you go back in time to see earlier states. This is definitely a more advanced usage, and it is related to this issue, but further thought is going to be required. As such, I'm moving this issue back into the backlog.
This will be resolved by #59
Right now, if you upload a duplicate file, the file is modified on S3 (e.g., its "last modified" timestamp changes). We need to ask some important data-management questions:
We should make sure that a dataset cannot be overwritten if someone uploads a different dataset with the same name.
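One way to enforce this guard is to compare a content hash of the local file against the metadata of any object already stored under that key, and refuse the upload on a mismatch. Below is a minimal sketch of that decision logic; the `should_upload` function and the `existing_meta` dict shape (mirroring the `ETag` field a `head_object`-style call would return) are hypothetical names for illustration, not part of the project's actual API.

```python
import hashlib


def should_upload(local_bytes, existing_meta):
    """Decide whether an upload should proceed without silently
    overwriting a different dataset stored under the same name.

    existing_meta is None when no object exists under the key, or a
    dict containing an 'ETag' entry (hypothetical shape, modeled on
    S3 object metadata) when one does.
    """
    if existing_meta is None:
        # New key: safe to upload.
        return True

    # For single-part uploads, the S3 ETag is the MD5 of the content,
    # so compare it against the local file's MD5.
    local_md5 = hashlib.md5(local_bytes).hexdigest()
    remote_etag = existing_meta["ETag"].strip('"')

    if local_md5 == remote_etag:
        # Identical content: skip the upload entirely, so the
        # "last modified" timestamp on S3 is left untouched.
        return False

    # Different content under the same name: refuse to overwrite.
    raise ValueError("a different dataset already exists under this name")
```

Note that the MD5/ETag comparison only holds for single-part uploads (multipart ETags are not plain MD5 digests), so a production version would likely store its own content hash in object metadata instead.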