treasure-data / digdag

Workload Automation System
https://www.digdag.io/
Apache License 2.0

Long file error while archiving logs #162

Closed: krt closed this issue 8 years ago

krt commented 8 years ago

I ran into an error when I tried to update my workflow with digdag push default -r "$(git show --pretty=format:'%T' | head -n 1)".

The error is as follows:

2016-07-07 17:18:39 +0900: Digdag v0.8.3
Creating .digdag/tmp/archive-4904269853204965434.tar.gz...
  Archiving log/2016-07-06/0.1daily_batch@20160707T000000+0900_bd59fcda-cda6-4e6e-a6d8-dbeb840d9fdb/+daily_batch+vacuum_reindex_and_cache_update+vacuum_reindex+vacuum_reindex_end_notify@577debe02d1ee740.8483@xxxxxxx.log.gz
error: file name 'log/2016-07-06/0.1daily_batch@20160707T000000+0900_bd59fcda-cda6-4e6e-a6d8-dbeb840d9fdb/+daily_batch+vacuum_reindex_and_cache_update+vacuum_reindex+vacuum_reindex_end_notify@577debe02d1ee740.8483@xxxxxxx.log.gz' is too long ( > 100 bytes)

Finally, I shut down the digdag server, cleared the built-in DB files and log files, restarted the server, and re-pushed the new workflow, which fixed the problem.

I wonder if something is wrong around TarArchiveOutputStream. Any comments would be appreciated. Thanks.

frsyuki commented 8 years ago

Apparently TarArchiveOutputStream doesn't support file names longer than 100 bytes by default; one of the long file modes (POSIX, GNU, or truncate) needs to be enabled on the stream. I'll check it. Thanks for reporting!
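The 100-byte limit comes from the classic ustar tar header, which reserves exactly 100 bytes for the member name; Commons Compress's TarArchiveOutputStream defaults to rejecting longer names, and calling setLongFileMode with LONGFILE_POSIX (or LONGFILE_GNU) switches to an extended-header encoding that lifts the limit. As a format-level illustration only (not digdag's actual code), the same behavior can be reproduced with Python's standard tarfile module, where the path component length is made up for the demo:

```python
import io
import tarfile

# A single path component well over 100 bytes, loosely modeled on the
# long digdag session log file names in the report (name is illustrative).
long_name = "log/" + "x" * 120 + ".log.gz"
data = b"example log payload"

def archive(fmt):
    """Write one member into an in-memory tar using the given tar format."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w", format=fmt) as tar:
        info = tarfile.TarInfo(name=long_name)
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

# The classic ustar format rejects the name: the component after the
# last usable slash is longer than the 100-byte name field, so it
# cannot be split into the prefix/name header fields.
try:
    archive(tarfile.USTAR_FORMAT)
except ValueError as e:
    print("ustar:", e)

# The POSIX pax format stores the long name in an extended header,
# so archiving succeeds.
archived = archive(tarfile.PAX_FORMAT)
print("pax: archived", len(archived), "bytes")
```

The POSIX pax route is generally the safer fix, since GNU long-name extensions are not understood by every tar implementation.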

krt commented 8 years ago

Thanks for the quick response!