choderalab / assaytools

Modeling and Bayesian analysis of fluorescence and absorbance assays.
http://assaytools.readthedocs.org
GNU Lesser General Public License v2.1
18 stars 11 forks

Updating quickmodel with more options and to use parser function. #97

Closed sonyahanson closed 7 years ago

sonyahanson commented 7 years ago

Updating quickmodel with a few more options, and switching it to use parser.py, which I've been working on so that Greg and Mehtap can use it outside of quickmodel.

Options to add:

sonyahanson commented 7 years ago

Okay, so it turns out that the things that I thought would be easy to get done before the more advanced changes above took a bit longer than expected. What HAS been done here:

The other things will have to wait for now, but have been added to #72; see the discussion on #91 for more about adding Mehtap's plate format.

Mergeable if tests are passing.

jchodera commented 7 years ago

Looks like there is a py3 issue with the JSON serialization: https://travis-ci.org/choderalab/assaytools/jobs/228916368#L671-L688

The failure seems to be caused by the json.dump of the metadata:

Traceback (most recent call last):
  File "/home/travis/miniconda/envs/test/bin/quickmodel", line 11, in <module>
    load_entry_point('assaytools==0.2.0', 'console_scripts', 'quickmodel')()
  File "/home/travis/miniconda/envs/test/lib/python3.4/site-packages/assaytools/scripts/quickmodel.py", line 241, in entry_point
    quick_model(inputs, nsamples=args.nsamples, nthin=args.nthin)
  File "/home/travis/miniconda/envs/test/lib/python3.4/site-packages/assaytools/scripts/quickmodel.py", line 211, in quick_model
    json.dump(metadata, outfile, sort_keys = True, indent = 4, ensure_ascii=False)
  File "/home/travis/miniconda/envs/test/lib/python3.4/json/__init__.py", line 178, in dump
    for chunk in iterable:
  File "/home/travis/miniconda/envs/test/lib/python3.4/json/encoder.py", line 422, in _iterencode
    yield from _iterencode_dict(o, _current_indent_level)
  File "/home/travis/miniconda/envs/test/lib/python3.4/json/encoder.py", line 396, in _iterencode_dict
    yield from chunks
  File "/home/travis/miniconda/envs/test/lib/python3.4/json/encoder.py", line 429, in _iterencode
    o = _default(o)
  File "/home/travis/miniconda/envs/test/lib/python3.4/json/encoder.py", line 173, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: 0 is not JSON serializable
jchodera commented 7 years ago

My best guess is that the error we see is due to a 0 that is not a plain Python int but an np.int64, which isn't automatically converted to a Python int by the JSON encoder. For example, in Python 3.5:

mski1776:benchmark choderaj$ python
Python 3.5.3 |Continuum Analytics, Inc.| (default, Mar  6 2017, 12:15:08) 
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import json
>>> import numpy as np
>>> outfile = open('test.json', 'w')
>>> # This works fine:
>>> json.dump(0, outfile, sort_keys = True, indent = 4, ensure_ascii=False)
>>> # This breaks:
>>> json.dump(np.int64(0), outfile, sort_keys = True, indent = 4, ensure_ascii=False)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/choderaj/miniconda3/lib/python3.5/json/__init__.py", line 178, in dump
    for chunk in iterable:
  File "/Users/choderaj/miniconda3/lib/python3.5/json/encoder.py", line 436, in _iterencode
    o = _default(o)
  File "/Users/choderaj/miniconda3/lib/python3.5/json/encoder.py", line 179, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: 0 is not JSON serializable
jchodera commented 7 years ago

I can't tell whether it's inputs or outputs that contains the unserializable object. You'll have to double-check that nothing in there is still a numpy object before JSON-serializing it.
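One common way to guard against stray numpy scalars, rather than hunting each one down, is to pass a `default=` hook to `json.dump`/`json.dumps`; the encoder calls it for any object it can't serialize natively. A minimal sketch (the metadata contents below are invented for illustration; this is not the actual quickmodel fix):

```python
import json
import numpy as np

def to_builtin(obj):
    """Fallback for json.dump: convert numpy types to builtin Python types."""
    if isinstance(obj, np.integer):
        return int(obj)
    if isinstance(obj, np.floating):
        return float(obj)
    if isinstance(obj, np.ndarray):
        return obj.tolist()
    raise TypeError(repr(obj) + " is not JSON serializable")

# Hypothetical metadata dict containing numpy scalars, like quickmodel's:
metadata = {'well_index': np.int64(0), 'gain': np.float64(100.0)}

# Without default=, this would raise "TypeError: 0 is not JSON serializable";
# with the hook, the np.int64 is coerced to a plain int first.
serialized = json.dumps(metadata, sort_keys=True, indent=4, default=to_builtin)
```

This keeps the rest of the `json.dump(metadata, outfile, ...)` call unchanged and only adds the `default=to_builtin` argument.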

sonyahanson commented 7 years ago

I think this should fix it, but I wasn't able to reproduce the error in a Python 3 environment, so I'm not sure. We'll just have to wait and see!
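The actual change isn't quoted in this thread, but one defensive approach (an alternative to an encoder hook) is to recursively sanitize the metadata before dumping it, so that nested numpy values are converted too. A sketch under that assumption, with hypothetical metadata:

```python
import json
import numpy as np

def sanitize(obj):
    """Recursively replace numpy scalars/arrays with builtin equivalents."""
    if isinstance(obj, dict):
        return {key: sanitize(value) for key, value in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [sanitize(value) for value in obj]
    if isinstance(obj, np.integer):
        return int(obj)
    if isinstance(obj, np.floating):
        return float(obj)
    if isinstance(obj, np.ndarray):
        return obj.tolist()
    return obj

# Hypothetical nested metadata containing numpy types:
metadata = {'wells': [np.int64(0), np.int64(1)], 'gain': np.float64(100.0)}

# After sanitizing, json.dumps succeeds without a default= hook.
clean = sanitize(metadata)
serialized = json.dumps(clean, sort_keys=True, indent=4)
```

Compared with a `default=` hook, this also fixes dict keys and works with any downstream serializer, at the cost of walking the whole structure up front.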

sonyahanson commented 7 years ago

Checks are passing. Merging.