Closed by GoogleCodeExporter.
Can you share your shell script with us?
Original comment by devraj
on 24 Oct 2009 at 10:18
If the script is being run as a cron job, the shell may have a restricted
environment with limited information, such as the system encoding, resulting in
sys.getfilesystemencoding() not resolving to anything.
Original comment by devraj
on 24 Oct 2009 at 10:54
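devraj's point about cron's stripped-down environment can be demonstrated
directly. This is an illustrative sketch, not part of the issue: `env -i`
simulates the near-empty environment a cron job runs under, where LANG is
unset unless the script exports it itself.

```shell
# Under a bare environment such as cron's, LANG is typically unset:
env -i sh -c 'echo "LANG is: ${LANG:-unset}"'

# Exporting LANG (as the fix below does) makes it visible to child
# processes, so Python can resolve a filesystem encoding:
env -i LANG=en_US.UTF-8 sh -c 'echo "LANG is: $LANG"'
```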
Could be, but I have no idea how to correct that. :)
Strange thing is that the first few files were OK. Although I do have accented
characters in some of the file names, so that could be the reason.
My backup2l.conf script is quite long, the relevant parts are:
echo "starting GDocs backup"
sudo -u eszpee /home/eszpee/gdatacopier/gdatacopier-2.0.1/startbackup.sh
and the startbackup.sh script only contains this:
#!/bin/bash
/home/eszpee/gdatacopier/gdatacopier-2.0.1/gcp.py -p ******** -o -u eszpeebackup@gmail.com:/all/all/* /home/eszpee/gdatacopier/gdatacopier-2.0.1/files/
is there anything else I can provide?
Original comment by esz...@gmail.com
on 24 Oct 2009 at 11:38
You were right devraj, it was indeed the problem with the shell environment.
The following line in my backup script solved everything:
export LANG=en_US.UTF-8;
Maybe you could fall back to UTF-8 if you don't find any filesystem encoding,
but that's just a nice touch; this is clearly not the fault of the script.
Thanks for the hint.
Original comment by esz...@gmail.com
on 28 Oct 2009 at 2:45
Thanks for reporting back. I might actually add an error message if there
isn't a LANG variable set; defaulting to UTF-8 might make it crash for people
who require non-UTF-8 support.
Original comment by devraj
on 28 Oct 2009 at 10:42
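The check devraj describes might look something like the following. This is a
hedged sketch, not the actual gdatacopier code: the function name
check_locale_environment and the exact message are illustrative, and the set
of variables inspected (LC_ALL, LC_CTYPE, LANG) is an assumption based on the
usual POSIX locale lookup order.

```python
import os
import sys


def check_locale_environment():
    """Illustrative sketch (not gdatacopier's actual code): exit with a
    helpful message if the environment carries no locale hints, instead
    of silently defaulting to UTF-8."""
    # POSIX locale resolution consults LC_ALL, then LC_CTYPE, then LANG.
    if not any(os.environ.get(var) for var in ("LC_ALL", "LC_CTYPE", "LANG")):
        sys.stderr.write(
            "error: no encoding found in environment; "
            "set e.g. LANG=en_US.UTF-8 before running from cron\n"
        )
        sys.exit(1)
    return sys.getfilesystemencoding()
```

Erroring out rather than assuming UTF-8 matches the concern above: a hard
default could break users whose filenames are in a non-UTF-8 encoding.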
Included check for encoding in environment variables.
Original comment by devraj
on 29 Oct 2009 at 8:32
Great, let me know if I can help testing.
Original comment by esz...@gmail.com
on 29 Oct 2009 at 8:38
Original issue reported on code.google.com by
esz...@gmail.com
on 24 Oct 2009 at 8:44