18.11.2024 - Srdjan
Did a bit of research on best practices for uploading a new version to Filebase while always keeping a backup.
Here is what I found we should do: create a backup bucket and a new-prod bucket. Every deployment should first try to move the new-prod bucket's contents into the backup bucket, and then (if successful) upload the new file to the new-prod bucket.
Also did a bit of research on the Filebase API in order to do this through a GitHub Action.
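A minimal sketch of that backup-then-upload flow with the aws cli, assuming Filebase's S3-compatible endpoint (s3.filebase.com) and the two bucket names above; the artifact name is a placeholder:

```bash
#!/usr/bin/env bash
set -euo pipefail

ENDPOINT="https://s3.filebase.com"   # Filebase's S3-compatible endpoint (assumption)
ARTIFACT="dist/app.zip"              # placeholder artifact name

# 1. Copy the current new-prod contents into the backup bucket
#    (use `aws s3 mv --recursive` instead if the objects should actually be moved).
aws s3 sync "s3://new-prod" "s3://backup" --endpoint-url "$ENDPOINT"

# 2. Only if the backup step succeeded (set -e aborts otherwise), upload the new build.
aws s3 cp "$ARTIFACT" "s3://new-prod/app.zip" --endpoint-url "$ENDPOINT"
```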
19.11.2024 - Srdjan
The newest decision on how we are going to upload files: add a timestamp + the GitHub commit hash to the file name and upload a new file on each deployment (a sketch of that naming step is below the list).
Tried to implement this in GitHub Actions, ran into a few issues and resolved some of them:
[x] run github actions locally
[ ] log in to aws cli
[x] setup docker
[x] create .env in docker environment
[x] resolve secret key issue
[x] create new file name variable
[x] run build on same node version
[ ] try S3md instead of aws because of an unknown error
[ ] pass the host to S3md and try to log in there
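A minimal sketch of the "new file name variable" step, assuming a bash `run` step in the workflow; `GITHUB_SHA` and `GITHUB_ENV` are provided by GitHub Actions, while the `build` prefix and `.zip` extension are placeholders:

```bash
# Compose a unique object name from a UTC timestamp and the short commit hash.
TIMESTAMP="$(date -u +%Y%m%d-%H%M%S)"
SHORT_SHA="${GITHUB_SHA:0:7}"                       # GITHUB_SHA is set by GitHub Actions
FILE_NAME="build-${TIMESTAMP}-${SHORT_SHA}.zip"     # placeholder prefix/extension

# Expose the name to later workflow steps via the GITHUB_ENV file.
echo "FILE_NAME=${FILE_NAME}" >> "$GITHUB_ENV"
```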
20.11.2024 - Srdjan
- Okay, after trying S3md I gave up on that solution as well, due to the lack of logs and the number of errors. (aws is the best choice anyway, but I wanted to try S3md just for learning purposes.)
- Went back to aws, but this time used curl to install the aws cli instead of using the aws actions, and managed to log in to aws.
- Managed to upload a test file to my filestore.
Now all that is left is to remove all hardcoded values and clean up the action, and everything should be fine.
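Roughly what the curl install + login + test upload could look like, as a sketch; the credential variable names and bucket are placeholders, and the endpoint assumes Filebase's S3-compatible API:

```bash
# Install the aws cli v2 via curl instead of using the aws GitHub actions.
curl -sSL "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o awscliv2.zip
unzip -q awscliv2.zip
sudo ./aws/install

# "Log in": the cli only needs credentials; here they come from repo secrets (placeholder names).
aws configure set aws_access_key_id     "$FILEBASE_KEY"
aws configure set aws_secret_access_key "$FILEBASE_SECRET"

# Upload a test file to the bucket through Filebase's S3-compatible endpoint.
aws s3 cp test.txt "s3://my-bucket/test.txt" --endpoint-url "https://s3.filebase.com"
```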
21.11.2024 - Srdjan
[x] removed all hardcoded values
[x] updated readme
[x] cleaned up
[x] added correct env values
[x] tested
22.11.2024 - Srdjan
[x] test action on a fork
[x] add secrets to fork and configure repo (see the sketch after this list)
[x] remove aws installation to ubuntu
[ ] finish up small changes and push it to main repo
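The notes don't say how the secrets were added to the fork; one way to do it from the command line is the GitHub CLI. A sketch with placeholder secret names and a placeholder fork path:

```bash
# Add the repo secrets the workflow expects to the fork (placeholder names and repo).
gh secret set FILEBASE_KEY    --repo my-user/my-fork --body "$FILEBASE_KEY"
gh secret set FILEBASE_SECRET --repo my-user/my-fork --body "$FILEBASE_SECRET"

# The workflow then reads them as ${{ secrets.FILEBASE_KEY }} / ${{ secrets.FILEBASE_SECRET }}.
```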