mverleg / pyjson_tricks

Extra features for Python's JSON: comments, order, numpy, pandas, datetimes, and many more! Simple but customizable.

Can't load a dumped object of class scipy.optimize.OptimizeResult #68

Closed · eiffleduarte closed this 4 years ago

eiffleduarte commented 4 years ago

I am trying to use json_tricks to store optimization results obtained with scipy.optimize.minimize, but I cannot load the dumped objects back with json_tricks.load. I expected any file generated with dump to load seamlessly.

Is there a way to avoid this error? Did I do something wrong?

Example (adapted from https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html#scipy.optimize.minimize):


>>> from json_tricks import dump, load
>>> from scipy.optimize import minimize, rosen
>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> res = minimize(rosen, x0, method='Nelder-Mead', tol=1e-6)
>>> print(res)
 final_simplex: (array([[1.00000002, 1.00000002, 1.00000007, 1.00000015, 1.00000028],
       [0.99999999, 0.99999996, 0.99999994, 0.99999986, 0.99999971],
       [1.00000005, 1.00000007, 1.00000017, 1.00000031, 1.00000063],
       [1.00000004, 1.00000008, 1.00000013, 1.00000025, 1.00000047],
       [0.99999999, 0.99999996, 0.99999994, 0.99999984, 0.99999963],
       [1.00000005, 1.00000004, 1.00000003, 1.00000003, 1.00000004]]), array([1.94206402e-13, 2.44964782e-13, 3.10422870e-13, 3.37952410e-13,
       5.52173609e-13, 7.16586838e-13]))
           fun: 1.9420640199868412e-13
       message: 'Optimization terminated successfully.'
          nfev: 494
           nit: 295
        status: 0
       success: True
             x: array([1.00000002, 1.00000002, 1.00000007, 1.00000015, 1.00000028])

>>> dump(res,open("teste.json","w"))
'{"fun": 1.9420640199868412e-13, "nit": 295, "nfev": 494, "status": 0, "success": true, "message": "Optimization terminated successfully.", "x": {"__ndarray__": [1.0000000163145237, 1.000000017346742, 1.0000000669109803, 1.00000014831607, 1.000000282846266], "dtype": "float64", "shape": [5]}, "final_simplex": [{"__ndarray__": [[1.0000000163145237, 1.000000017346742, 1.0000000669109803, 1.00000014831607, 1.000000282846266], [0.9999999937629743, 0.9999999579594641, 0.9999999408311967, 0.999999855487161, 0.9999997137662322], [1.0000000481146265, 1.0000000696881115, 1.000000165765766, 1.0000003127419719, 1.000000630560188], [1.0000000352355618, 1.0000000815697203, 1.0000001262754896, 1.0000002487813209, 1.0000004655337267], [0.9999999940610255, 0.9999999622842339, 0.9999999441724164, 0.9999998368570097, 0.9999996348948336], [1.0000000483446105, 1.000000037420985, 1.000000033633306, 1.0000000264835782, 1.0000000376037868]], "dtype": "float64", "shape": [6, 5], "Corder": true}, {"__ndarray__": [1.9420640199868412e-13, 2.4496478242923324e-13, 3.1042287042339975e-13, 3.3795240978762425e-13, 5.521736090599669e-13, 7.165868377873191e-13], "dtype": "float64", "shape": [6]}]}'
>>> res2 = load("teste.json")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/ctp8/opt/anaconda3/lib/python3.7/site-packages/json_tricks/nonp.py", line 237, in load
    allow_duplicates=allow_duplicates, conv_str_byte=conv_str_byte, **jsonkwargs)
  File "/Users/ctp8/opt/anaconda3/lib/python3.7/site-packages/json_tricks/nonp.py", line 205, in loads
    'string to `load(s)`, for example bytevar.encode("utf-8") if utf-8 is the encoding.').format(type(string)))
TypeError: Cannot automatically encode object of type "<class 'bytes'>" in `json_tricks.load(s)` since the encoding is not known. You should instead encode the bytes to a string and pass that string to `load(s)`, for example bytevar.encode("utf-8") if utf-8 is the encoding.

mverleg commented 4 years ago

Thanks for reporting!

For now you can work around it by passing conv_str_byte=True to load.

I haven't experimented with passing a file object to dump but a path string to load, as in your example, but that combination ought to work without the conv_str_byte=True workaround. I'll see what I can do.
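
In the meantime, an untested sketch of that workaround (names taken from your example; since OptimizeResult is a dict subclass, load should hand back a plain dict with the ndarray fields restored):

from json_tricks import load

res2 = load("teste.json", conv_str_byte=True)  # let json_tricks convert the bytes it read to a string (assumes UTF-8)
print(res2["x"])  # the numpy array comes back via the __ndarray__ hook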

mverleg commented 4 years ago

Another workaround, somewhat amusingly, is to enable compression, since that forces binary mode. It might also save space if the result matrix is bigger than the example.

dump(res, open("test.json", "wb"), compression=True)
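
Reading it back would then also go through binary mode; passing decompression=True should take care of the gzip decoding (untested sketch):

res2 = load(open("test.json", "rb"), decompression=True)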
mverleg commented 4 years ago

Version 3.14.0 solves this issue and adds some tests for it.

It seems like this issue must've affected a number of people, so thanks for reporting it. I'm glad it's finally solved!
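
For anyone landing here later, a minimal round-trip sketch of what should work on 3.14.0 and up (assumes scipy is installed; as above, load returns a plain dict because OptimizeResult subclasses dict):

from json_tricks import dump, load
from scipy.optimize import minimize, rosen

res = minimize(rosen, [1.3, 0.7, 0.8, 1.9, 1.2], method='Nelder-Mead', tol=1e-6)
with open("teste.json", "w") as fh:
    dump(res, fh)          # text-mode file object, as in the original report
res2 = load("teste.json")  # path string, no conv_str_byte workaround needed
print(res2["x"])           # ndarray restored by json_tricks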

eiffleduarte commented 4 years ago

I upgraded to 3.14.0 and everything works fine.

Thanks for fixing it so fast and even updating the conda package!

mverleg commented 4 years ago

Thanks @eiffleduarte, though I think @jhkennedy maintains the conda package (#49), not me.

jhkennedy commented 4 years ago

@eiffleduarte you're welcome on the conda package! Though really, the conda-forge bot does most of the work; I just do a quick check and hit merge most of the time.