hirak99 / yabsnap

Btrfs Scheduled Snapshot Manager for Arch
Apache License 2.0
62 stars 2 forks

Snapshots with trigger I #21

Closed · nebulosa2007 closed this 10 months ago

nebulosa2007 commented 11 months ago

I can't understand how yabsnap handles snapshots with trigger I, which are created by the pacman pre-hook.

Config is:

$ grep -v "^#" /etc/yabsnap/configs/root.conf
[DEFAULT]

source = /

dest_prefix = /.snapshots/@root-

trigger_interval = 1 hour

min_keep_secs = 1800

keep_user = 1

keep_preinstall = 1

preinstall_interval = 5 minutes

keep_hourly = 5
keep_daily = 1
keep_weekly = 1
keep_monthly = 1
keep_yearly = 1

keep_preinstall is 1

But the list of snapshots is:

$ yabsnap list
Config: /etc/yabsnap/configs/root.conf (source=/)
Snaps at: /.snapshots/@root-...
  20231024121406   I   (6 days 23h ago)      pacman --color=always --sync --needed p7zip
  20231024123022  S    (6 days 23h ago)      
  20231025092712   I   (6 days 2h ago)       pacman --color=always --sync --sysupgrade
  20231025200741   I   (5 days 15h ago)      /usr/bin/pacman -U /home/nebulosa/Documents/yabsnap/yabsnap-2.0.7-2-any.pkg.tar.zst
  20231026162630   I   (4 days 19h ago)      pacman --color=always --sync --sysupgrade
  20231026230426   I   (4 days 12h ago)      pacman --color=always --sync --sysupgrade
  20231027120119   I   (3 days 23h ago)      pacman --color=always --sync --sysupgrade
  20231028112115   I   (3 days ago)          pacman --color=always --sync --sysupgrade
  20231029142211   I   (1 day 21h ago)       pacman --color=always --sync --sysupgrade
  20231030092423   I   (1 day 2h ago)        pacman --color=always --sync --sysupgrade
  20231030120211   I   (23h 32m ago)         pacman --color=always --sync --sysupgrade
  20231030193026  S    (16h 4m ago)          
  20231030233958   I   (11h 54m ago)         pacman --color=always --sync --sysupgrade
  20231031092224  S    (2h 12m ago)          
  20231031094105   I   (1h 53m ago)          pacman --color=always --upgrade /home/nebulosa/.cache/pikaur/pkg/yabsnap-2.0.11-1-any.pkg.tar.zst
  20231031103029  S    (1h 4m ago)           
  20231031113006  S    (4m 41s ago) 

What did I miss?

hirak99 commented 11 months ago

Was keep_preinstall changed after the last pacman installation?

Currently, the schedule only affects the S trigger. So if keep_preinstall was large enough to save many snapshots and was then reduced, the schedule will not clean them up. They should be cleaned up when the next pacman hook runs.

nebulosa2007 commented 10 months ago

I deleted all snapshots, and keep_preinstall is still 1, but nothing happened after 2 or more days of use.

$ yabsnap list
Config: /etc/yabsnap/configs/root.conf (source=/)
Snaps at: /.snapshots/@root-...
  20231024123022  S    (1 week 3 days ago)   
  20231030193026  S    (4 days 16h ago)      
  20231102124755   I   (1 day 23h ago)       pacman -Rsn go wireguard-tools yarn
  20231102125525   I   (1 day 23h ago)       /usr/bin/pacman -S --asdeps wireguard-tools
  20231102145345   I   (1 day 21h ago)       pacman -Rsn go wireguard-tools yarn
  20231102160858   I   (1 day 20h ago)       /usr/bin/pacman -S --asdeps wireguard-tools
  20231102175035   I   (1 day 18h ago)       pacman --color=always --sync --needed plasma-disks
  20231102181019   I   (1 day 18h ago)       pacman --color=always --sync --needed gnome-disk-utility
  20231103100913   I   (1 day 2h ago)        pacman --color=always --sync --needed libva-utils
  20231103101712   I   (1 day 1h ago)        pacman --color=always --sync --needed libmfx intel-gmmlib
  20231103161012  S    (20h ago)             
  20231103161111   I   (19h 59m ago)         pacman --color=always --sync --sysupgrade
  20231104114347  S    (26m 32s ago)         
  20231104120841   I   (1m 38s ago)          pacman --color=always --sync --sysupgrade
hirak99 commented 10 months ago

Would it be possible to run the following?

yabsnap --dry-run --verbose internal-preupdate

This should show the planned operations. We can check whether the internal logic includes the subvolume delete commands.

For comparison, here is my output:

❯ yabsnap --dry-run --verbose internal-preupdate
2023-11-05 15:42:15 INFO: Reading config /etc/yabsnap/configs/root.conf
2023-11-05 15:42:15 INFO: Schedule is enabled.
2023-11-05 15:42:15 INFO: Reading config /etc/yabsnap/configs/root.conf
2023-11-05 15:42:15 INFO: Running stat -f --format=%T /
2023-11-05 15:42:15 INFO: Running stat --format=%i /
2023-11-05 15:42:15 INFO: Running stat -f --format=%T /
2023-11-05 15:42:15 INFO: Running stat --format=%i /
Would create /.snapshots/root-20231105154215-meta.json: {'source': '/', 'trigger': 'I', 'comment': 'pacman -S nvidia-prime'}
Would run btrfs subvolume snapshot -r / /.snapshots/root-20231105154215
Would run btrfs subvolume delete /.snapshots/root-20231102205543
Would delete /.snapshots/root-20231102205543-meta.json
2023-11-05 15:42:15 INFO: Reading config /etc/yabsnap/configs/home.conf
2023-11-05 15:42:15 INFO: Running stat -f --format=%T /home
2023-11-05 15:42:15 INFO: Running stat --format=%i /home
2023-11-05 15:42:15 INFO: Running stat -f --format=%T /home
2023-11-05 15:42:15 INFO: Running stat --format=%i /home
Would create /.snapshots/home-20231105154215-meta.json: {'source': '/home', 'trigger': 'I', 'comment': 'pacman -S nvidia-prime'}
Would run btrfs subvolume snapshot -r /home /.snapshots/home-20231105154215
Would run btrfs subvolume delete /.snapshots/home-20231104210225
Would delete /.snapshots/home-20231104210225-meta.json
hirak99 commented 10 months ago

Could you run the command yabsnap --dry-run --verbose internal-preupdate again after updating to v2.0.12?

I added a bit more logging.

nebulosa2007 commented 10 months ago

Log

Settings haven't changed.

$ yabsnap list
Config: /etc/yabsnap/configs/root.conf (source=/)
Snaps at: /.snapshots/@root-...
  20231024123022  S    (1 week 5 days ago)   
  20231030193026  S    (5 days 19h ago)      
  20231102124755   I   (3 days 2h ago)       pacman -Rsn go wireguard-tools yarn
  20231102125525   I   (3 days 2h ago)       /usr/bin/pacman -S --asdeps wireguard-tools
  20231102145345   I   (3 days ago)          pacman -Rsn go wireguard-tools yarn
  20231102160858   I   (2 days 22h ago)      /usr/bin/pacman -S --asdeps wireguard-tools
  20231102175035   I   (2 days 21h ago)      pacman --color=always --sync --needed plasma-disks
  20231102181019   I   (2 days 20h ago)      pacman --color=always --sync --needed gnome-disk-utility
  20231103100913   I   (2 days 4h ago)       pacman --color=always --sync --needed libva-utils
  20231103101712   I   (2 days 4h ago)       pacman --color=always --sync --needed libmfx intel-gmmlib
  20231103161111   I   (1 day 22h ago)       pacman --color=always --sync --sysupgrade
  20231104120841   I   (1 day 2h ago)        pacman --color=always --sync --sysupgrade
  20231104133203   I   (1 day 1h ago)        /usr/bin/pacman -S --asdeps wireguard-tools
  20231104153315   I   (23h 26m ago)         pacman -Rsn go wireguard-tools yarn
  20231104173023  S    (21h 29m ago)         
  20231105144440  S    (15m 33s ago)         
  20231105144537   I   (14m 36s ago)         /usr/bin/pacman -Sy
hirak99 commented 10 months ago

From the log, we now know that it is correctly reading the count (as 1), but is not attempting to delete anything.

I wonder if the logic that finds the earlier snapshots is not working for some reason.

Would you mind running this snippet as Python code and checking what it returns?

# Save as findsnaps.py and run with: python findsnaps.py

import os

# Based on your config.
DEST_PREFIX = '/.snapshots/@root-'

def show_existing_snaps():
    # Directory that contains the snapshots, derived from the prefix.
    snapsdir = os.path.dirname(DEST_PREFIX)
    for fname in os.listdir(snapsdir):
        print(f"Found {fname}")
        pathname = os.path.join(snapsdir, fname)
        if not os.path.isdir(pathname):
            print("  Not a directory")
            continue
        if not pathname.startswith(DEST_PREFIX):
            print("  Does not begin with prefix")
            continue
        print("  Valid.")

if __name__ == "__main__":
    show_existing_snaps()
nebulosa2007 commented 10 months ago

Sure! Here is the log: https://0x0.st/HtaZ.txt

hirak99 commented 10 months ago

The logs narrowed it down, and I found the bug.

The code was using array[:-k] to get all elements except the last k, to determine which of the previous snaps should be removed. But this works only if k > 0; when k == 0 it returns an empty list, so none of the snaps were being returned for removal.
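The pitfall can be reproduced in a few lines (a minimal sketch with hypothetical names, not the actual yabsnap code):

```python
snaps = ["snap1", "snap2", "snap3", "snap4"]

def to_delete_buggy(snaps, k):
    # Intent: keep the last k snapshots, return the rest for deletion.
    # Wrong when k == 0: snaps[:-0] is snaps[:0], i.e. an empty list.
    return snaps[:-k]

def to_delete_fixed(snaps, k):
    # Computing the end index explicitly handles k == 0 correctly.
    return snaps[: len(snaps) - k]

print(to_delete_buggy(snaps, 1))  # ['snap1', 'snap2', 'snap3']
print(to_delete_buggy(snaps, 0))  # [] -- should have been all four
print(to_delete_fixed(snaps, 0))  # ['snap1', 'snap2', 'snap3', 'snap4']
```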

This should be fixed in 2.0.13.

Thank you!

nebulosa2007 commented 10 months ago

Now the log is:

~ $ yabsnap --dry-run --verbose internal-preupdate
2023-11-06 10:18:42 INFO: Reading config /etc/yabsnap/configs/root.conf
2023-11-06 10:18:42 INFO: Schedule is enabled.
2023-11-06 10:18:42 INFO: Reading config /etc/yabsnap/configs/root.conf
2023-11-06 10:18:42 INFO: Maintain 1 volumes of type I.
2023-11-06 10:18:42 INFO: Running stat -f --format=%T /
2023-11-06 10:18:42 INFO: Running stat --format=%i /
2023-11-06 10:18:42 INFO: Running stat -f --format=%T /
2023-11-06 10:18:42 INFO: Running stat --format=%i /
Would create /.snapshots/@root-20231106101842-meta.json: {'source': '/', 'trigger': 'I', 'comment': 'pacman --color=always --upgrade /home/nebulosa/.cache/pikaur/pkg/yabsnap-2.0.13-1-any.pkg.tar.zst'}
Would run btrfs subvolume snapshot -r / /.snapshots/@root-20231106101842
Would run btrfs subvolume delete /.snapshots/@root-20231102124755
Would delete /.snapshots/@root-20231102124755-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231102125525
Would delete /.snapshots/@root-20231102125525-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231102145345
Would delete /.snapshots/@root-20231102145345-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231102160858
Would delete /.snapshots/@root-20231102160858-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231102175035
Would delete /.snapshots/@root-20231102175035-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231102181019
Would delete /.snapshots/@root-20231102181019-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231103100913
Would delete /.snapshots/@root-20231103100913-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231103101712
Would delete /.snapshots/@root-20231103101712-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231103161111
Would delete /.snapshots/@root-20231103161111-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231104120841
Would delete /.snapshots/@root-20231104120841-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231104133203
Would delete /.snapshots/@root-20231104133203-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231104153315
Would delete /.snapshots/@root-20231104153315-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231105144537
Would delete /.snapshots/@root-20231105144537-meta.json
Would run btrfs subvolume delete /.snapshots/@root-20231106101119
Would delete /.snapshots/@root-20231106101119-meta.json

I think the issue is fixed!