M40V / populse_mia

Multiparametric Image Analysis

[CLOSED] Populse_mia crashed during the import #80

M40V opened this issue 5 years ago

M40V commented 5 years ago

Issue by blemasso Friday May 10, 2019 at 08:01 GMT _Originally opened as https://github.com/populse/populse_mia/issues/80_


Describe the bug
When I try to import Bruker data into a new project, MIA crashes and I get the error below. The error appears right after I click 'OK' on the popup message "Export completed with success".

Generated error

gin-e05-laban:populse_mia blemasso$ python3 main.py
Populse_MIA in "developer" mode
/Users/blemasso/.populse_mia/configuration.yml has been detected.
Checking the installed versions of nipype and mia_processes ...


To Reproduce
Steps to reproduce the behavior:

  1. Go to 'File/Import'
  2. Click on 'Bruker', add new data to the basket and click 'export to MIA'
  3. See error
M40V commented 5 years ago

Comment by servoz Friday May 10, 2019 at 08:11 GMT


Hey, thanks for writing in. We are currently reworking the history of the repository; we did a rollback yesterday afternoon. Maybe this issue comes from that. Please wait until this afternoon: the repository will be clean again, and we will check on our side whether we observe the same bug!

M40V commented 5 years ago

Comment by servoz Friday May 10, 2019 at 15:35 GMT


With the latest version of the repository, I did not observe this bug. If you have not already done so this afternoon, please make a fresh clone of the repository from GitHub (git clone https://github.com/populse/populse_mia.git # in a terminal). I ran the test with the populse_mia/data_tests/Bruker_files/Guerbet-EB/eb140211a.oh1 data (in the populse_mia of the clone). Can you report whether you still experience the bug with these data?

M40V commented 5 years ago

Comment by blemasso Friday May 10, 2019 at 16:19 GMT


OK, it works with the data included in the git repository! The problem comes from another set of data. I attached a link to download it, so the bug can be reproduced: https://filesender.renater.fr/?s=download&token=38b80e02-a32f-27c3-5732-6a7aa9dec256

M40V commented 5 years ago

Comment by servoz Saturday May 11, 2019 at 09:19 GMT


OK. I have no access to a machine that can run populse_mia until next Monday. I will look at it Monday morning.

To be as fast as possible, it could be a good idea to open an issue in the populse/mri_conv project as well. I am not yet sure, but it sounds like a problem between mia and mri_conv.

M40V commented 5 years ago

Comment by servoz Monday May 13, 2019 at 08:19 GMT


I observe exactly the same KeyError exception, with your data:

Exception hooking in progress ...
Clean up before closing mia done ...
Traceback (most recent call last):
  File "/home/econdami/Git_Projects/populse_mia/python/populse_mia/project/controller.py", line 319, in run
    document], flush=False)
  File "/usr/lib/python3.6/site-packages/populse_db-0.0.1-py3.6.egg/populse_db/database.py", line 1397, in add_document
    sql = self.metadata.tables[table].insert()
KeyError: 'list_43b5c9175984c071f30b873fdce0a000_2f382d6e84047502ad3e1639f4f4eea4'
QObject::~QObject: Timers cannot be stopped from another thread

We observe, with these data, a warning message in mri_conv:

Sequences with number of slice orientation > 1 and scan mode = 1D will are not exported

Without further investigation, this may be linked to the present issue. We will fix this issue ASAP. It could also be a good idea to open an issue in the mri_conv project.

M40V commented 5 years ago

Comment by servoz Monday May 13, 2019 at 08:28 GMT


Edit: the warning message is not linked to this issue. It only concerns the localizer data ...

M40V commented 5 years ago

Comment by servoz Monday May 13, 2019 at 09:25 GMT


With this data set, each experiment can be exported individually, but the problem seems to start as soon as we want to import at least 2 experiments...

I succeeded in importing each experiment one by one. On the other hand, if I take the first two experiments, mia crashes with this message:

Exception hooking in progress ...

Clean up before closing mia done ...

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/populse_db-0.0.1-py3.6.egg/populse_db/database.py", line 1683, in __getattr__
    return getattr(self._FieldRow__row, name)
AttributeError: '43b5c9175984c071f30b873fdce0a000' object has no attribute 'RepetitionTime'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/econdami/Git_Projects/populse_mia/python/populse_mia/main_window/main_window.py", line 474, in import_data
    self.data_browser.table_data.add_rows(new_scans)
  File "/home/econdami/Git_Projects/populse_mia/python/populse_mia/data_browser/data_browser.py", line 995, in add_rows
    COLLECTION_CURRENT, scan, tag)
  File "/usr/lib/python3.6/site-packages/populse_db-0.0.1-py3.6.egg/populse_db/database.py", line 955, in get_value
    return getattr(document, field, None)
  File "/usr/lib/python3.6/site-packages/populse_db-0.0.1-py3.6.egg/populse_db/database.py", line 1689, in __getattr__
    self.__database.get_field(self.__collection, name).type, result)
  File "/usr/lib/python3.6/site-packages/populse_db-0.0.1-py3.6.egg/populse_db/database.py", line 1628, in __column_to_python
    return DatabaseSession.__column_to_list(column_type, value)
  File "/usr/lib/python3.6/site-packages/populse_db-0.0.1-py3.6.egg/populse_db/database.py", line 1655, in __column_to_list
    list_value = ast.literal_eval(value)
  File "/usr/lib64/python3.6/ast.py", line 85, in literal_eval
    return _convert(node_or_string)
  File "/usr/lib64/python3.6/ast.py", line 84, in _convert
    raise ValueError('malformed node or string: ' + repr(node))
ValueError: malformed node or string: 225.0
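
For reference, the ValueError at the end of this traceback can be reproduced outside populse_db. According to the traceback, list-typed columns are decoded with ast.literal_eval (see __column_to_list); if a column registered as a list actually holds a bare number, literal_eval receives a float instead of a string and fails with exactly this message. A minimal sketch:

import ast

# literal_eval expects a string (or AST node); a plain float such as a single
# RepetitionTime value of 225.0 triggers "malformed node or string".
try:
    ast.literal_eval(225.0)
except ValueError as exc:
    print(exc)  # malformed node or string: 225.0
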
M40V commented 5 years ago

Comment by blemasso Monday May 13, 2019 at 11:26 GMT


OK, I opened an issue (linked to this one) in the Populse/MRI_conv project.

M40V commented 5 years ago

Comment by servoz Tuesday May 14, 2019 at 08:12 GMT


We currently don't have time to look deeply into it, but we are keeping this ticket in mind and will fix it ASAP.

M40V commented 5 years ago

Comment by montigno Thursday May 16, 2019 at 10:28 GMT


The problem seems to be with the "EchoTime" and "RepetitionTime" parameters. When we import one experiment with a single EchoTime value and another with multiple EchoTime values, populse_mia crashes. Same problem with RepetitionTime. In the json file, EchoTime and RepetitionTime appear as a simple list for a single value and as a nested array for multiple values. The database seems to crash as soon as the same parameter changes structure (list to array or vice versa).
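
A minimal illustration of this mismatch (the tag values and JSON shapes below are assumed for the example, not taken from the actual dataset): the same tag can come out of the converter with two different structures, so a field type inferred from the first imported document no longer matches the second one.

import json

# Single-value experiment vs multi-value experiment for the same tag.
doc_single = json.loads('{"EchoTime": 25.0}')
doc_multi = json.loads('{"EchoTime": [[25.0], [50.0]]}')

# A field typed from the first document (float here) cannot store the second
# document's value (a list), and vice versa.
print(type(doc_single["EchoTime"]).__name__)  # float
print(type(doc_multi["EchoTime"]).__name__)   # list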

M40V commented 5 years ago

Comment by servoz Saturday May 18, 2019 at 09:13 GMT


We observe two possible exceptions with this data set (a ValueError in ast and a KeyError in sqlalchemy), but both come from the same cause, related to how tags are typed in populse_mia's DataBrowser.

We strongly wish to manage the type of each column (field) in the DataBrowser, for obvious reasons of controlling the flow of data in pipelines. Thus, some populse_db classes are overloaded in populse_mia in order to assign to a field, when it is created, the type of the corresponding tag in the first scan (document) added to the DataBrowser. That is why populse_mia crashes when we import data containing objects with different structures (with this data set, list or list-of-list types for the same field in the .json files received from mri_conv, and therefore float or list in populse_mia).

So now, what can we do about it...

The data browser was built around this central concept of tag typing, and I don't see any good reason to change it (nipype and capsul also rely heavily on typing, with traits).

One option could be to decide that some parameters are always of list-of-list type. For example, RepetitionTime could be [[2.3]] or [[2.4],[4.2]] in the .json files from mri_conv; in populse_mia it would then always be a plain list, [2.3] or [2.4, 4.2]. In this case we don't change the central dogma of metadata typing in populse_mia, and I am quite confident (not totally sure, but very confident) that with a simple len() test it will be easy to manage.
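
A minimal sketch of what such a len()-based conversion could look like on the populse_mia side (tag_value_from_mri_conv is a hypothetical helper name, not an existing function, and it assumes the list-of-list layout described above):

def tag_value_from_mri_conv(raw):
    # Turn a list-of-lists value written by mri_conv (e.g. [[2.3]] or
    # [[2.4], [4.2]]) into the plain list stored as the tag value in
    # populse_mia ([2.3] or [2.4, 4.2]).
    if isinstance(raw, list) and raw and isinstance(raw[0], list):
        return [inner[0] if len(inner) == 1 else inner for inner in raw]
    return raw  # already a scalar or a flat list: leave unchanged

assert tag_value_from_mri_conv([[2.3]]) == [2.3]
assert tag_value_from_mri_conv([[2.4], [4.2]]) == [2.4, 4.2]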

Is it a big job to allow this change in mri_conv?

Is there a problem I'm forgetting if we push this evolution in the .json files generated by mri_conv?

M40V commented 5 years ago

Comment by montigno Thursday May 23, 2019 at 08:37 GMT


OK, I can modify mri_conv so that parameters such as RepetitionTime or EchoTime are always of list type (always in double brackets: [[2.0]] or [[3.0],[4.0]]). It would not be a problem for MP3. Patience for the next version!

M40V commented 5 years ago

Comment by servoz Thursday May 23, 2019 at 09:35 GMT


As far as I can remember (I'm afraid I lost my notes!), the problem comes, at least, from the RepetitionTime, EchoTime AND InversionTime parameters in this dataset.

This change is not conceptually insignificant. As we had no comment on this subject, we can consider that no one has seen a conceptual problem with the idea of always defining some parameters as a list (a list of lists from mri_conv), even if the value is actually only a single float (for example) for some datasets.

We can therefore start with these 3 parameters. I think the set of parameters with this "special" status will need to be extended ... but we will see, depending on the datasets processed in the future!

When the change is made on your side, the cleanest way to proceed will be to keep your former mri_conv version in production; we will make the necessary changes in populse_db and populse_mia in a development branch, using your new mri_conv! Let us know when you are ready.

M40V commented 5 years ago

Comment by montigno Thursday Jun 06, 2019 at 11:54 GMT


mri_conv now records all parameters with floating-point or integer values as lists of lists (arrays) in the JSON file. You can make the changes in populse_db.

M40V commented 5 years ago

Comment by servoz Friday Jun 07, 2019 at 13:46 GMT


The change was made in populse_mia and it seems to be working fine now! Thanks.

M40V commented 5 years ago

Comment by montigno Tuesday Jun 11, 2019 at 07:32 GMT


Good !