I know macOS isn't "supported", but maybe we can try to fix this one. I think the issue results from the offset falling back to the default value set in boar/workdir.py:624 in load_workdir_parameters(), which is a byte string and not a unicode object:
    def load_workdir_parameters():
        metapath = find_meta(tounicode(os.getcwd()))
        if not metapath:
            return None
        info = load_meta_info(metapath)
        root = os.path.split(metapath)[0]
        return {"repoUrl": info['repo_path'],
                "sessionName": info['session_name'],
                "offset": info.get("offset", ""),  ############## look here ##############
                "revision": info['session_id'],
                "root": root}
I seem to have fixed this problem by inserting this code into the __init__ of the Workdir class (boar/workdir.py, line 53):

was:

    assert isinstance(offset, unicode)

now:

    if not isinstance(offset, unicode):
        offset = unicode(offset)
    assert isinstance(offset, unicode)
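The same defensive pattern can be sketched in a form that runs on both Python 2 and 3 (coerce_offset is a hypothetical helper for illustration, not part of boar):

```python
# Hypothetical helper mirroring the fix above: normalize an offset that may
# arrive as a byte string (Python 2 "str") into text before asserting.
def coerce_offset(offset):
    if isinstance(offset, bytes):  # "str" under Python 2
        offset = offset.decode("utf-8")
    return offset

assert coerce_offset(b"subdir") == u"subdir"
assert coerce_offset(u"subdir") == u"subdir"
```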
This seems to fix the issue. Hopefully without screwing anything else up :)
Original comment by jeff.kno...@gmail.com
on 12 Jul 2012 at 9:20
Hello Jeff. Unfortunately I cannot replicate this problem on my (ancient)
10.2.0 macbook with python 2.6. I'll have to depend on you to help me fix this.
There is certainly a bug on the line that you point out in your comment. That
default value should be u"". I'll fix that. However, that default value should
only be used when accessing truly ancient work directories. A brand new work
directory should certainly have an "offset" value in its properties. Feel free
to change the default value to u"" and see if that affects the problem.
I'm a bit suspicious about your Python version, 2.7.1. There is a known bug in
that release that affects boar, see issue 72. Boar always tries to check for
that python bug and will exit with an error message if it is found. But if that
check passes erroneously for some reason, I would expect to see your exact
problems. The problem is that json strings should always be unicode, but in
python 2.7.1 they are only unicode for strings containing non-ascii characters.
Therefore, if possible, I'd like you to try to upgrade to python 2.7.2 or
2.7.3.
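Assuming the bug behaves as described above (ASCII-only JSON strings decoded to str instead of unicode on 2.7.1), a quick sanity check of the interpreter would be:

```python
import json

# On a correct interpreter, json decodes JSON strings to the text type even
# for pure-ASCII content; on the buggy Python 2.7.1 json, an ASCII-only
# string reportedly came back as a byte string (str) instead.
try:
    text_type = unicode  # Python 2
except NameError:
    text_type = str      # Python 3

s = json.loads('"offset"')  # ASCII-only JSON string
assert isinstance(s, text_type), "this interpreter may have the 2.7.1 json bug"
```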
Your suggested fix would probably work fine, but I'd be uncomfortable with
fixing up faulty parameters without knowing why they are wrong to begin with.
Original comment by ekb...@gmail.com
on 12 Jul 2012 at 11:35
Hey,
Thanks for getting back to me on this. Looking back at this, I found that the default value wasn't being used. Instead, the offset value comes from load_meta_info(), so the offset originates in the workdir/info file.
I put some prints in:

    def load_workdir_parameters():
        metapath = find_meta(tounicode(os.getcwd()))
        if not metapath:
            return None
        info = load_meta_info(metapath)
        print info
        print "info['offset']:", info['offset'], " type(info['offset']):", type(info['offset'])
        root = os.path.split(metapath)[0]
And got:

    {u'session_name': u'saribi_data', u'repo_path': u'boar+ssh://prod@78.46.68.134/srv/www/saribi_repo', u'session_id': 3, u'offset': ''}
    info['offset']:  type(info['offset']): <type 'str'>
I think this should be easy to fix in one of two ways:

1) If you want to keep everything in unicode, you could coerce each string value read in by load_meta_info() with unicode(). This one fixes the issue for me:
    def load_meta_info(metapath):
        assert metapath
        with safe_open(os.path.join(metapath, "info"), "rb") as f:
            info = json.load(f)
        for key in info:
            if isinstance(info[key], basestring):
                info[key] = unicode(info[key])
        return info
2) Alternatively, you could change your assertions to basestring, which is a superclass of both unicode and str:

    assert isinstance(offset, unicode)  ->  assert isinstance(offset, basestring)

Note that there are further assertions that break on the offset. I would guess that if you pick this option you would want to change it throughout the code.
Anyway thanks for getting back to me. If you would like I can check in a
branch with this code, but it should be easy enough to replicate yourself.
Thanks for making boar, it's really useful! I just finished getting it set up and I think it's going to be great for my project. I love utilities like this coded in Python!
The only drawback right now is the inability to diff large files. I'm
hosting a repository on my data server and it takes a long time
to propagate any changes to a large file to a local clone or workdir, even
if the change is very minor. Many of my data files are text fixtures in JSON, but I could imagine something general that diffs on text when possible and could also handle binary files, etc. Maybe something to think about for the future!
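This is not boar's actual wire protocol, but the general idea hinted at above can be sketched with fixed-size block hashes: only blocks whose hash changed would need to be transferred (a real implementation such as rsync uses a rolling checksum, which also catches content that has merely shifted position):

```python
import hashlib

BLOCK_SIZE = 4  # tiny for illustration; real tools use kilobyte-scale blocks

def block_hashes(data, block_size=BLOCK_SIZE):
    """Hash each fixed-size block so unchanged regions can be skipped."""
    return [hashlib.sha1(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

old = b'{"a": 1, "b": 2, "c": 3}'
new = b'{"a": 1, "b": 9, "c": 3}'  # one small edit

changed = [i for i, (h_old, h_new)
           in enumerate(zip(block_hashes(old), block_hashes(new)))
           if h_old != h_new]
# only the changed block(s) would need to go over the wire
```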
Anyway great work!
JK
Original comment by jeff.kno...@gmail.com
on 13 Jul 2012 at 5:03
Original issue reported on code.google.com by
jeff.kno...@gmail.com
on 12 Jul 2012 at 8:30