File Upload Flow
Click Upload Button
The user initiates the upload by clicking the upload button.
Select Files
The user selects the files they want to upload.
Upload Starts
The selected files begin uploading to the server.
Upload Ends
Once the upload is complete:
- An object representing the file is added to the Postgres "files" table:
  - Linked to the current sessionId.
  - Contains the necessary information to retrieve the file.
- The "files" endpoint is invalidated:
  - The fileList is automatically updated with the newly uploaded file.
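The upload-complete step above can be sketched as follows. `filesTable` and `invalidateFilesEndpoint` are in-memory stand-ins for the real Postgres table and the cache layer behind the "files" endpoint, and the field names are assumptions, not the project's actual schema:

```typescript
import { randomUUID } from "node:crypto";

// Shape of a row in the "files" table (field names are assumed).
interface FileRecord {
  id: string;
  sessionId: string; // links the file to the current session
  bucketKey: string; // enough information to retrieve the file from S3
  uploadedAt: Date;
}

// In-memory stand-in for the Postgres "files" table.
const filesTable: FileRecord[] = [];

// Stand-in for invalidating the "files" endpoint; in the real app this
// would tell the cache layer to refetch, updating fileList automatically.
let filesEndpointStale = false;
function invalidateFilesEndpoint(): void {
  filesEndpointStale = true;
}

// Runs once the upload is complete.
function onUploadComplete(sessionId: string, bucketKey: string): FileRecord {
  const record: FileRecord = {
    id: randomUUID(),
    sessionId,
    bucketKey,
    uploadedAt: new Date(),
  };
  filesTable.push(record);   // row linked to the current sessionId
  invalidateFilesEndpoint(); // fileList picks up the new file
  return record;
}
```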
How to Handle Expired Files
Every uploaded file has an expiration date, after which it is no longer stored. This can be implemented in either of the following ways:
S3 Object Lifecycle
Leverage S3's lifecycle rules to automatically delete objects after they reach a certain age (X days old).
This would trigger an event that also deletes the object entry from Postgres.
The main challenge is integrating this event into the project cleanly, so that expiry is a first-class feature rather than a bolt-on.
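The lifecycle rule itself is a small piece of bucket configuration. A sketch of its shape in TypeScript follows; the `uploads/` prefix and the 7-day age stand in for "X" and are assumptions, and in practice the object would be applied via the AWS SDK's PutBucketLifecycleConfiguration call or the console:

```typescript
// Lifecycle configuration that expires objects X days after creation.
// Prefix and Days are illustrative values, not project settings.
const lifecycleConfig = {
  Rules: [
    {
      ID: "expire-uploaded-files",
      Status: "Enabled",
      Filter: { Prefix: "uploads/" }, // only uploaded files, not other objects
      Expiration: { Days: 7 },        // "X days old"
    },
  ],
};
```

S3 can emit an event notification when lifecycle expiration deletes an object, which would be the hook for removing the matching Postgres entry.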
GitHub Actions or Cron Job
Use a periodic process (such as a cron job) to fetch and delete files that are X days old:
- Remove them from the S3 bucket.
- Remove the corresponding entries from Postgres in the same pass.
This approach keeps the cleanup logic encapsulated within the project and requires fewer AWS-specific configurations.
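A minimal sketch of the selection step of such a job, assuming X = 7 days and a hypothetical `StoredFile` shape; the actual S3 and Postgres deletes are left as comments because they depend on the project's clients:

```typescript
interface StoredFile {
  id: string;
  bucketKey: string;
  uploadedAt: Date;
}

const EXPIRY_DAYS = 7; // "X days old" -- assumed value
const MS_PER_DAY = 24 * 60 * 60 * 1000;

// Return the files this run of the job should delete.
function selectExpired(files: StoredFile[], now: Date): StoredFile[] {
  const cutoff = now.getTime() - EXPIRY_DAYS * MS_PER_DAY;
  return files.filter((f) => f.uploadedAt.getTime() < cutoff);
}

// For each expired file, the real job would then do roughly:
//   await s3.send(new DeleteObjectCommand({ Bucket: bucket, Key: f.bucketKey }));
//   await db.query("DELETE FROM files WHERE id = $1", [f.id]);
```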