Backblaze / B2_Command_Line_Tool

The command-line tool that gives easy access to all of the capabilities of B2 Cloud Storage

b2sdk._internal.exception.BadRequest: No active upload for: .... #1050

Open

program-the-brain-not-the-heartbeat commented 4 weeks ago

I have a script that backs up two files (a file backup and a MariaDB SQL dump) in each of multiple /home folders (about 100). It succeeds in most cases, but maybe 25% of them fail with the following error. I believe this only started recently.

I will update to 4.x branch if necessary but I also wouldn't mind staying on 3.19.1 for now.

Versions: Ubuntu 20.04.6 LTS; b2 command line tool, version 3.19.1

Command:

```
/usr/local/bin/b2 sync ${LOCAL_BACKUP_FOLDER}/ b2://${DIRECTORY_B2}/${USER_NAME}/ --keep-days=186 --replace-newer
```

Output: I've obscured some personal information, like file IDs and file names.

```
hide   *******.sql.tar.gz
upload *******.sql.tar.gz.enc
ERROR:b2sdk._internal.sync.action:an exception occurred in a sync action
Traceback (most recent call last):
  File "b2sdk/_internal/sync/action.py", line 55, in run
  File "b2sdk/_internal/sync/action.py", line 167, in do_action
  File "logfury/_logfury/trace_call.py", line 86, in wrapper
  File "b2sdk/_internal/bucket.py", line 1216, in concatenate
  File "logfury/_logfury/trace_call.py", line 86, in wrapper
  File "b2sdk/_internal/bucket.py", line 989, in create_file
  File "b2sdk/_internal/bucket.py", line 1136, in _create_file
  File "logfury/_logfury/trace_call.py", line 86, in wrapper
  File "b2sdk/_internal/transfer/emerge/emerger.py", line 173, in emerge
  File "b2sdk/_internal/transfer/emerge/emerger.py", line 115, in _emerge
  File "b2sdk/_internal/transfer/emerge/executor.py", line 82, in execute_emerge_plan
  File "b2sdk/_internal/transfer/emerge/executor.py", line 231, in execute_plan
  File "concurrent/futures/_base.py", line 456, in result
  File "concurrent/futures/_base.py", line 401, in __get_result
  File "concurrent/futures/thread.py", line 58, in run
  File "b2sdk/_internal/transfer/outbound/upload_manager.py", line 170, in _upload_part
  File "b2sdk/_internal/session.py", line 382, in upload_part
  File "b2sdk/_internal/session.py", line 456, in _wrap_token
  File "b2sdk/_internal/session.py", line 471, in _reauthorization_loop
  File "b2sdk/_internal/session.py", line 544, in _upload_part
  File "b2sdk/_internal/raw_api.py", line 1070, in upload_part
  File "b2sdk/_internal/b2http.py", line 356, in post_content_return_json
  File "b2sdk/_internal/b2http.py", line 309, in request_content_return_json
  File "b2sdk/_internal/b2http.py", line 279, in request
  File "b2sdk/_internal/b2http.py", line 588, in _translate_and_retry
  File "b2sdk/_internal/b2http.py", line 520, in _translate_errors
b2sdk._internal.exception.BadRequest: No active upload for: ***** (bad_request)
b2_upload(/home/*****/private/backups/******.tar.gz.enc, **/****.tar.gz.enc, 1728979337700): BadRequest() No active upload for: ***** (bad_request)
ERROR: Incomplete sync: sync is incomplete
```
ppolewicz commented 4 weeks ago

Is it possible that a large file upload session was cancelled while the sync was uploading parts?

program-the-brain-not-the-heartbeat commented 3 weeks ago

I'm not entirely sure, as the script runs via a cron job every night.

After looping through all directories, it does issue a `b2 cancel-all-unfinished-large-files <BUCKET>` command.

I've also executed that `cancel-all-unfinished-large-files` command and then attempted to run the script manually, and it failed again with the same error.

`b2 list-unfinished-large-files <BUCKET>` returns an empty list.
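For reference, the flow of the nightly script is roughly this. This is a simplified sketch, not the actual script: the `b2` function below is a stub that just echoes its arguments so the control flow is visible without credentials (the real script calls `/usr/local/bin/b2` directly), and the bucket name and user list are placeholders. The point is that the cancel command only runs after every per-user sync has returned.

```shell
#!/usr/bin/env bash
# Stub standing in for the real b2 binary, so the flow can be shown
# without a real bucket; the real script invokes /usr/local/bin/b2.
b2() { echo "b2 $*"; }

DIRECTORY_B2=example-bucket          # placeholder bucket name

# The real loop walks ~100 /home folders; two users stand in here.
for USER_NAME in alice bob; do
    LOCAL_BACKUP_FOLDER="/home/${USER_NAME}/private/backups"
    b2 sync "${LOCAL_BACKUP_FOLDER}/" \
        "b2://${DIRECTORY_B2}/${USER_NAME}/" \
        --keep-days=186 --replace-newer
done

# Runs only after the loop is done, i.e. after every sync has exited,
# so it shouldn't overlap with any in-flight part upload.
b2 cancel-all-unfinished-large-files "${DIRECTORY_B2}"
```

So unless two runs of the whole script overlap, the cancel shouldn't coincide with an active upload.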