Hello,
I just wanted to drop you a quick note to tell you what a fantastic script this is. It worked absolutely perfectly for me, shrinking a 64GB image of a Raspberry Pi Kali Linux MicroSD card. I used dcfldd for the initial imaging (I have found that it images significantly faster than standard dd). This script is so much faster than the manual method I've been doing for a year or more now (image the card, mount it as a loop device, modprobe, fdisk, calculate, gparted, truncate). I can do it all manually pretty well now, but this saves so much time.
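For anyone curious, the final truncation step of that manual workflow can be demonstrated safely on a scratch file (the path and sizes below are made up for illustration, not taken from the script):

```shell
# Make an 8 MiB scratch file standing in for a disk image.
dd if=/dev/zero of=/tmp/demo.img bs=1M count=8 status=none
stat -c %s /tmp/demo.img   # 8388608 bytes

# Cut the file down to 4 MiB, as you would after shrinking the
# last partition (the target size here is arbitrary).
truncate -s 4M /tmp/demo.img
stat -c %s /tmp/demo.img   # 4194304 bytes

rm /tmp/demo.img
```

On a real image, the truncate size would come from the end of the last partition rather than a fixed number.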
I just wanted to let you know and to say Thank You!
I also wanted to offer a suggestion. I played around with the compression arguments (I even installed and used pigz), and it worked well, but it seemed to take a long time compared to what I have been using.
I stumbled across ZSTD a while ago and have been using it ever since.
ZSTD actually appears to be part of a number of standard Linux builds (if it's not there, it is super easy to add, just like pigz was) and is even being incorporated into parts of the Linux kernel - https://www.phoronix.com/scan.php?page=news_item&px=Zstd-Updated-For-Kernel-Soon
So this simple command line:
zstd -T0 <name_of_image_file>
compresses my ~16GB shrunken image (shrunk to 16GB from the original 64GB dcfldd image file) down to 6.6GB, which is about 200MB less than the pigz-compressed file, and zstd did it in just over a minute (the pigz compression took quite a bit longer).
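In case it helps, here is a small round-trip sketch using standard zstd flags (-T0 for all cores, -k to keep the input, -d to decompress); the file names are made up:

```shell
# Toy file standing in for a shrunken image.
printf 'some image data' > /tmp/demo.img

# Compress with all cores (-T0), keeping the original (-k).
zstd -q -T0 -k /tmp/demo.img -o /tmp/demo.img.zst

# Decompress back and verify the round trip byte-for-byte.
zstd -q -d -f /tmp/demo.img.zst -o /tmp/demo.restored
cmp /tmp/demo.img /tmp/demo.restored && echo "round trip OK"

rm /tmp/demo.img /tmp/demo.img.zst /tmp/demo.restored
```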
So my suggestion is to add compressing with zstd as an option. If you do, great; if not, no big deal ... I can do it myself.
Thanks again!
Robert