scidash / neuronunit

A package for data-driven validation of neuron and ion channel models using SciUnit
http://neuronunit.scidash.org

This is an error I am getting in the context of large exhaustive search runs. #144

Closed russelljjarvis closed 6 years ago

russelljjarvis commented 6 years ago

I don't understand what is causing it.

I have disabled include_includes to see if that helps:

        more_attributes = pynml.read_lems_file(self.model.orig_lems_file_path,
                                               include_includes=False,
                                               debug=False)
---------------------------------------------------------------------------
SystemExit                                Traceback (most recent call last)
<string> in <module>()
/opt/conda/lib/python3.5/site-packages/ipyparallel/client/remotefunction.py in <lambda>(f, *sequences)
    248             if _mapping:
    249                 if sys.version_info[0] >= 3:
--> 250                     f = lambda f, *sequences: list(map(f, *sequences))
    251                 else:
    252                     f = map
/opt/conda/lib/python3.5/site-packages/ipyparallel/controller/dependency.py in __call__(self, *args, **kwargs)
     71 
     72     def __call__(self, *args, **kwargs):
---> 73         return self.f(*args, **kwargs)
     74 
     75     if not py3compat.PY3:
~/neuronunit/neuronunit/tests/fi.py in check_current(ampl, dtc)
    307             #import pdb; pdb.set_trace()
    308             DELAY = 100.0*pq.ms
--> 309             DURATION = 1000.0*pq.ms
    310             params = {'injected_square_current':
    311                       {'amplitude':100.0*pq.pA, 'delay':DELAY, 'duration':DURATION}}
~/neuronunit/neuronunit/models/reduced.py in __init__(self, LEMS_file_path, name, backend, attrs)
     22         """
     23         super(ReducedModel,self).__init__(LEMS_file_path,name=name,
---> 24                                           backend=backend,attrs=attrs)
     25         self.run_number = 0
     26         self.tstop = None
~/neuronunit/neuronunit/models/__init__.py in __init__(self, LEMS_file_path, name, backend, attrs)
     42         self.rerun = True # Needs to be rerun since it hasn't been run yet!
     43         self.unpicklable = []
---> 44         self.set_backend(backend)
     45 
     46     def get_backend(self):
~/neuronunit/neuronunit/models/__init__.py in set_backend(self, backend)
     81                             % name)
     82         self._backend.model = self
---> 83         self._backend.init_backend(*args, **kwargs)
     84 
     85     def get_nml_paths(self, lems_tree=None, absolute=True, original=False):
~/neuronunit/neuronunit/models/backends.py in init_backend(self, attrs, cell_name, current_src_name, DTC)
    295         self.lookup = {}
    296 
--> 297         super(NEURONBackend,self).init_backend()
    298         self.model.unpicklable += ['h','ns','_backend']
    299 
~/neuronunit/neuronunit/models/backends.py in init_backend(self, *args, **kwargs)
     44         if self.use_disk_cache:
     45             self.init_disk_cache()
---> 46         self.load_model()
     47         self.model.unpicklable += ['_backend']
     48 
~/neuronunit/neuronunit/models/backends.py in load_model(self, verbose)
    484         more_attributes = pynml.read_lems_file(self.model.orig_lems_file_path,
    485                                                include_includes=True,
--> 486                                                debug=False)
    487         for i in more_attributes.components:
    488             #This code strips out simulation parameters from the xml tree also such as duration.
/opt/conda/lib/python3.5/site-packages/pyneuroml/pynml.py in read_lems_file(lems_file_name, include_includes, debug)
    431     if not os.path.isfile(lems_file_name):
    432         print_comment("Unable to find file: %s!"%lems_file_name, True)
--> 433         sys.exit()
    434 
    435     model = lems_model.Model(include_includes=include_includes)
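The traceback ends in `pynml.read_lems_file` calling `sys.exit()` when the LEMS file is not on disk, which hides the offending path. A minimal pre-check along these lines (a sketch; `checked_lems_path` is a hypothetical helper, not part of neuronunit or pyNeuroML) would surface the path instead:

```python
import os

def checked_lems_path(lems_file_path):
    """Return the path if it exists on this worker's disk, else raise.

    Intended to run just before pynml.read_lems_file, which calls
    sys.exit() (rather than raising) when the file is missing, so the
    bad path never reaches the traceback.
    """
    if not os.path.isfile(lems_file_path):
        raise FileNotFoundError("LEMS file missing on this worker: %r"
                                % lems_file_path)
    return lems_file_path
```

Calling this with `self.model.orig_lems_file_path` before `read_lems_file` would turn the silent `SystemExit` into an exception that names the missing file.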
rgerkin commented 6 years ago

Could you verify from the worker stdout that lems_file_name is the path you are expecting (i.e. a real file that exists on disk)? You may need to edit your pynml.py source to ensure that print_comment actually prints; I think there is a flag that turns it on and off, and it may be the debug argument in the function visible just above it in the stack. Then we would know whether it is failing to read a file that actually exists, or whether it is for some reason looking for a file that does not exist.
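The check above could be sketched as a small diagnostic run on each worker just before the `read_lems_file` call (a sketch; `report_lems_path` is a hypothetical helper, not part of neuronunit):

```python
import os
import socket

def report_lems_path(lems_file_path):
    # Print enough context to identify which worker (host/pid) is
    # looking at which path, and whether the file exists there.
    exists = os.path.isfile(lems_file_path)
    print("host=%s pid=%d path=%r exists=%s"
          % (socket.gethostname(), os.getpid(), lems_file_path, exists))
    return exists
```

In an ipyparallel run the printed lines would show up in the worker stdout, which should distinguish a genuinely missing file from a path that only exists on some engines.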

russelljjarvis commented 6 years ago

OK. I have made sure verbose is set to true, and I am running a big run again.

russelljjarvis commented 6 years ago

I have not had this error for a while. I am unsure if it still applies.