nelsonic opened 5 years ago
While attempting to run the tar command:

`tar -zcvf logs.tar.gz logs/`

as per https://unix.stackexchange.com/questions/93139/can-i-zip-an-entire-folder-using-gzip we received the following error:

`tar: logs: file changed as we read it`
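That error appears because the log files are still being appended to while tar reads them. One way to keep archiving anyway is GNU tar's `--warning=no-file-changed` option, remembering that exit code 1 then only means "some files changed while being read". A sketch (the `logs/` setup is a stand-in for the real directory):

```shell
# Stand-in for the real logs/ directory:
mkdir -p logs && echo "demo entry" > logs/demo.log

# GNU tar: silence the "file changed as we read it" warning.
# Exit code 0 = clean, 1 = some files differed while being read
# (tolerable for logs); anything else is a real failure.
tar -zcvf logs.tar.gz --warning=no-file-changed logs/
rc=$?
[ "$rc" -eq 0 ] || [ "$rc" -eq 1 ] || { echo "tar failed: $rc" >&2; exit "$rc"; }
echo "archived $(tar -tzf logs.tar.gz | wc -l) entries"
```

This avoids stopping the server at all, at the cost of possibly catching a file mid-write.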
Now deleting the `.gz` files recursively:

`find . -name "*.gz" -type f -delete`
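A recursive `-delete` is irreversible, so previewing the matches with `-print` first is cheap insurance. A self-contained sketch (the `demo/` tree is made up for illustration):

```shell
# Made-up directory tree for the demo:
mkdir -p demo/sub
touch demo/a.gz demo/sub/b.gz demo/keep.txt

# Dry run: -print shows exactly what -delete would remove.
find demo -name "*.gz" -type f -print

# Same expression with -delete actually removes the matches:
find demo -name "*.gz" -type f -delete

# Re-running the dry run now prints nothing: all .gz files are gone.
find demo -name "*.gz" -type f -print
```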
Temporarily stopping the Node.js server in order to perform the archive:

`forever stop server.js`
Now running:

`tar -zcvf logs.tar.gz logs/`

And it's taking forever ... ⏳
https://stackoverflow.com/questions/9427553/how-to-download-a-file-from-server-using-ssh

Format:

`scp your_username@remotehost.edu:foobar.txt /local/dir`

Actual:

`scp root@178.79.141.232:hits/logs.tar.gz ./logs.tar.gz`
That worked.
Ecto refresher: https://geoffreylessel.com/2016/from-zero-to-ecto-in-10-minutes
Processing large files: https://www.poeticoding.com/processing-large-csv-files-with-elixir-streams
https://serverfault.com/questions/264595/can-scp-copy-directories-recursively

`scp -rp root@178.79.141.232:hits/logs ./`
`scp -rp root@185.3.95.195:hits/logs ./`
In progress. Heading to bed; let's see if it works overnight.
This is why we can't have nice things:
At present the Node.js MVP saves data to the instance filesystem because that was the simplest way of storing data without having to manage any database. See: hits-nodejs/lib/db_filesystem.js#L26-L68

This worked well for the MVP as it streams the contents of the file each time a request is made and counts the lines in the file as the `count`. Node.js shines at this because `fs.createReadStream` is non-blocking. Anyway, with the migration to Phoenix, we need to:

Tasks
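The MVP's "the count is the number of lines" scheme is easy to sanity-check from the shell; a sketch with a made-up log file:

```shell
# Made-up log file: the Node.js MVP appends one line per hit.
printf 'hit 1\nhit 2\nhit 3\n' > example.log

# The displayed count is simply the line count of the file:
wc -l < example.log
```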
I estimate that this will take me `T4h` because I have to set up the project on `localhost` and create the `hits_dev` database in PostgreSQL on `localhost`.
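For the Phoenix side, a typical `config/dev.exs` Repo entry pointing Ecto at a `hits_dev` PostgreSQL database on `localhost` looks roughly like the fragment below; the `:hits` app name, the `Hits.Repo` module name, and the credentials are assumptions for illustration, not taken from the repo:

```elixir
# config/dev.exs — sketch; app/module/credential names are assumptions
config :hits, Hits.Repo,
  username: "postgres",
  password: "postgres",
  database: "hits_dev",
  hostname: "localhost",
  pool_size: 10
```

With a Repo configured like this, `mix ecto.create` creates the `hits_dev` database.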