valentinocossar opened 8 years ago
Can someone help me? :(
Hi,
I ran into the same problem today (on a Linux platform). Tracing the code, I noticed the problem was in lib/capistrano/tasks/wpdb.rake, at the `download!` call:
```ruby
desc "Backup the remote database"
task :remote do
  on roles(:web) do
    within release_path do
      execute :wp, :db, :export, "- |", :gzip, ">", fetch(:wpcli_remote_db_file)
      download! fetch(:wpcli_remote_db_file), Pathname.new(fetch(:wpcli_local_db_backup_dir)).join(fetch(:wpcli_local_db_backup_filename))
      execute :rm, fetch(:wpcli_remote_db_file)
    end
  end
end
```
This caused `Pathname#write` to be called for every 32 KB of data or less, each call overwriting (not appending to) the previous data. I'm not a Ruby programmer myself, and looking at the Pathname documentation I found no clues, so I changed the task to build the destination with `File.join` instead, and it just worked:
```ruby
desc "Backup the remote database"
task :remote do
  on roles(:web) do
    within release_path do
      execute :wp, :db, :export, "- |", :gzip, ">", fetch(:wpcli_remote_db_file)
      download! fetch(:wpcli_remote_db_file), File.join(fetch(:wpcli_local_db_backup_dir), fetch(:wpcli_local_db_backup_filename))
      execute :rm, fetch(:wpcli_remote_db_file)
    end
  end
end
```
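The difference between the two variants comes down to how net-scp inspects its destination argument. A minimal sketch (the paths here are illustrative, not the real config values): a `Pathname` responds to `:write`, so net-scp treats it as an IO-like sink, while the plain `String` returned by `File.join` does not:

```ruby
require "pathname"

# Hypothetical paths, for illustration only.
pathname_dest = Pathname.new("/tmp/backups").join("db.sql.gz")
string_dest   = File.join("/tmp/backups", "db.sql.gz")

# net-scp checks the destination with respond_to?(:write): a Pathname
# responds, so each received chunk is handed to Pathname#write; a
# String does not, so net-scp opens the file itself and streams into it.
puts pathname_dest.respond_to?(:write)  # => true
puts string_dest.respond_to?(:write)    # => false
```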
I don't know if this is the right solution, but just in case it helps somebody...
I've found the (supposed) solution: you have to add `.to_s` after `.join(fetch(:wpcli_local_db_backup_filename))` at the end of this line:

download! fetch(:wpcli_remote_db_file), Pathname.new(fetch(:wpcli_local_db_backup_dir)).join(fetch(:wpcli_local_db_backup_filename))

I think this is the right solution, but I'm not sure!
Yes, that's a correct solution as well. The problem I traced led to the net-scp gem, where `download!`'s second argument is checked for a `:write` method (see link). If the parameter responds to `:write`, net-scp treats it as an IO-like object and calls `write` on it for each chunk received; but as far as I can see, each write overwrites the previous one... So I guess both solutions lead to the same result :-)
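To make the overwriting behaviour concrete, here is a small sketch that simulates net-scp delivering a download in chunks and calling `#write` on a `Pathname` destination for each one; `Pathname#write` truncates the file on every call, so only the last chunk survives:

```ruby
require "pathname"
require "tmpdir"

Dir.mktmpdir do |dir|
  path = Pathname.new(dir).join("dump.sql.gz")

  # Simulate net-scp handing over the download in two chunks and
  # calling #write on the destination for each one, as it does for
  # objects that respond to :write.
  path.write("chunk-1")
  path.write("chunk-2")

  # Pathname#write opens the file in "w" mode each time, truncating
  # the previous contents, so only the final chunk remains.
  puts path.read  # => "chunk-2"
end
```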
Thanks for sharing your solution!
Thank you for the explanation and for sharing your solution 😁
Hi guys!
I'm experiencing some trouble with the backup:remote task. The exported remote .sql.gz file in the /tmp folder is perfect, but when the download! command downloads the db dump file, locally I find a 15 KB file named db_stage_timestamp.sql.gz that's corrupted.
When I try to extract the file by double-clicking it, the Mac's Compression Utility gives me back a file with a .sql.gz 2.cpgz extension, and if I try to extract the .sql.gz 2.cpgz file, it gives me back a file with a .sql 2.gz extension, and so on.
I also tried to extract it with the gunzip command, but it returns this error:
gunzip: db_stage_timestamp.sql.gz: not in gzip format
I don't think the problem is the download! command itself; maybe it's Pathname.new() or something else in the backup:remote task, because push and pull (which also use the download! command) work well without any kind of problem.
Mac OS X version: 10.11.5
rbenv version: 1.0.0
Ruby version: 2.3.1p112 (2016-04-26 revision 54768) [x86_64-darwin15]
Here you can find all the gems installed on my Mac:
Any suggestion? Thank you.