bcasillas opened this issue 7 years ago
Should be fixed in cd8a1f1.
(Added an estimate for the cache size in bg_computer to allow in-place assembly without the database. cachetools doesn't support an infinite cache size...)
Could you please try it and let me know the result?
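To illustrate the constraint behind that estimate: caches in the cachetools style evict against a concrete bound, so an "infinite" size (e.g. -1) has to be replaced by a positive estimate. This is a minimal stdlib-only sketch; the `LRUCache` class here is illustrative and is not FERDA's vendored `libs/cachetools` code.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch with a mandatory finite maxsize.

    Eviction only works against a concrete bound, so a maxsize of -1
    ("infinite") is rejected up front, as in the warning quoted below.
    """

    def __init__(self, maxsize):
        if maxsize is None or maxsize < 0:
            raise ValueError("infinite cache size is not supported; "
                             "pass a positive size estimate")
        self.maxsize = maxsize
        self._data = OrderedDict()

    def __setitem__(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)     # refresh recency
        self._data[key] = value
        while len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict least recently used

    def __getitem__(self, key):
        self._data.move_to_end(key)
        return self._data[key]

    def __contains__(self, key):
        return key in self._data

cache = LRUCache(maxsize=2)
cache["a"] = 1
cache["b"] = 2
cache["c"] = 3   # evicts "a", the least recently used entry
```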
commit cd8a1f1
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/nfs/m8/home/casillas/ferda/FERDA/scripts/export/export_part.py", line 195, in <module>
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/nfs/m8/home/casillas/ferda/FERDA/scripts/export/export_part.py", line 196, in <module>
Commit bc904ce. Can you include the module in FERDA, or should we install it on all nodes?
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/nfs/m8/home/casillas/ferda/FERDA/scripts/export/export_part.py", line 196, in <module>
If h5py is installed on the server (an easy check: run python and try import h5py), I can include the hickle library in the repository; if not, we need to install it (or I can switch to pickle, but for these cases hickle is much, much faster).
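The check described above can be scripted. A small sketch, assuming nothing beyond the stdlib; the helper names `hdf5_available` and `pick_serializer` are mine, not part of FERDA:

```python
import importlib

def hdf5_available():
    """The easy check suggested above: can we import h5py?"""
    try:
        importlib.import_module("h5py")
        return True
    except ImportError:
        return False

def pick_serializer():
    """Prefer hickle (HDF5-backed, much faster for large numeric data)
    when h5py is present; otherwise fall back to the stdlib pickle."""
    if hdf5_available():
        try:
            import hickle
            return hickle
        except ImportError:
            pass  # h5py present but hickle itself is not installed
    import pickle
    return pickle
```

Both modules expose `dump`/`load` entry points, though their signatures differ (hickle writes to a filename, pickle to an open file object), so call sites would still need a thin wrapper.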
h5py seems to be installed
ok then... Commit d877c9b should work for you.
Still couldn't import hickle:
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/nfs/m8/home/casillas/ferda/FERDA/scripts/export/export_part.py", line 196, in <module>
sklearn version 0.14.1
commit 1e8103e with error:

core/region/region_manager.py:31: UserWarning: cache size limit -1 - infinity is not supported right now!!!
  warnings.warn("cache size limit -1 - infinity is not supported right now!!!")
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/nfs/m8/home/casillas/ferda/FERDA/scripts/export/export_part.py", line 195, in <module>
    assembly_after_parallelization(bgcomp)
  File "core/bg_computer_assembling.py", line 73, in assembly_after_parallelization
    merge_parts(bgcomp.project.gm, g, relevant_vertices, bgcomp.project, rm_old, chm)
  File "core/bg_computer_assembling.py", line 290, in merge_parts
    new_rm.add(old_reg)
  File "core/region/region_manager.py", line 120, in add
    self.add_tocache(id, regions)
  File "core/region/region_manager.py", line 138, in add_tocache
    self.cache[id] = region
  File "libs/cachetools/lru.py", line 21, in __setitem__
    cache_setitem(self, key, value)
  File "libs/cachetools/cache.py", line 50, in __setitem__
    raise ValueError('value too large')
ValueError: value too large
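This error comes from a size-bounded cache refusing a single value bigger than its entire maxsize, which suggests the estimated cache size was smaller than one region object. A stdlib-only sketch of that behavior; the `SizedCache` class is illustrative, not the vendored `libs/cachetools` code:

```python
import sys

class SizedCache:
    """Sketch of a size-bounded cache in the cachetools style.

    __setitem__ rejects any single value whose reported size exceeds
    maxsize, producing exactly "ValueError: value too large".
    """

    def __init__(self, maxsize, getsizeof=lambda value: 1):
        self.maxsize = maxsize
        self.getsizeof = getsizeof
        self._data = {}
        self._currsize = 0

    def __setitem__(self, key, value):
        size = self.getsizeof(value)
        if size > self.maxsize:
            raise ValueError("value too large")
        # evict entries (oldest insertion first) until the value fits
        while self._currsize + size > self.maxsize:
            old_key = next(iter(self._data))
            self._currsize -= self.getsizeof(self._data.pop(old_key))
        self._data[key] = value
        self._currsize += size

cache = SizedCache(maxsize=2048, getsizeof=sys.getsizeof)
cache["small"] = b"x" * 100        # fits within the bound
try:
    cache["huge"] = b"x" * 4096    # single item larger than the cache
except ValueError as exc:
    print(exc)                     # value too large
```

If this is the mechanism, raising the size estimate above the largest single region would avoid the error.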