CAnBioNet / TkNA


'File name too long' error? #81

Closed zhuxinyue1998 closed 3 months ago

zhuxinyue1998 commented 3 months ago

Hello, I encountered an error while trying to construct a network using TkNA. I ran the following command in the terminal on my MacBook Pro:

    python ../reconstruction/run.py --data-source ./test_AB.zip --config-file ./config.json --out-file ./network_output.zip

Unfortunately, I received the following error message:

    Traceback (most recent call last):
      File "../reconstruction/run.py", line 66, in <module>
        network_reconstructor.reconstructNetwork(config, dataset, start=args.start, stopStage=args.stop, dataOutFilePath=args.outFile, cores=args.cores)
      File "/Users/xinyuezhu/Work/Benchmark causal inference method/test_method/TkNA-main/reconstruction/reconstruction/NetworkReconstructorAggregate.py", line 749, in reconstructNetwork
        return self.runPipeline(stages, allData, **kwargs)
      File "/Users/xinyuezhu/Work/Benchmark causal inference method/test_method/TkNA-main/reconstruction/reconstruction/NetworkReconstructor.py", line 59, in runPipeline
        stage(allData)
      File "/Users/xinyuezhu/Work/Benchmark causal inference method/test_method/TkNA-main/reconstruction/reconstruction/NetworkReconstructorAggregate.py", line 695, in computeCorrelations
        allData["correlationCoefficients"], allData["correlationPValues"] = calculateCorrelations(config, allData["filteredData"], cores)
      File "/Users/xinyuezhu/Work/Benchmark causal inference method/test_method/TkNA-main/reconstruction/reconstruction/NetworkReconstructorAggregate.py", line 463, in calculateCorrelations
        correlationResults = treatmentData.groupby("metatreatment").map(calculateForMetatreatment)
      File "/Users/xinyuezhu/opt/anaconda3/envs/tkna/lib/python3.8/site-packages/xarray/core/groupby.py", line 795, in map
        return self._combine(applied, shortcut=shortcut)
      File "/Users/xinyuezhu/opt/anaconda3/envs/tkna/lib/python3.8/site-packages/xarray/core/groupby.py", line 814, in _combine
        applied_example, applied = peek_at(applied)
      File "/Users/xinyuezhu/opt/anaconda3/envs/tkna/lib/python3.8/site-packages/xarray/core/utils.py", line 196, in peek_at
        peek = next(gen)
      File "/Users/xinyuezhu/opt/anaconda3/envs/tkna/lib/python3.8/site-packages/xarray/core/groupby.py", line 794, in <genexpr>
        applied = (maybe_wrap_array(arr, func(arr, *args, **kwargs)) for arr in grouped)
      File "/Users/xinyuezhu/Work/Benchmark causal inference method/test_method/TkNA-main/reconstruction/reconstruction/NetworkReconstructorAggregate.py", line 423, in calculateForMetatreatment
        treatmentDataSharedMemory = shared_memory.SharedMemory(create=True, size=treatmentData.nbytes, name=sharedMemoryName("treatmentData", runnerId))
      File "/Users/xinyuezhu/opt/anaconda3/envs/tkna/lib/python3.8/multiprocessing/shared_memory.py", line 102, in __init__
        self._fd = _posixshmem.shm_open(
    OSError: [Errno 63] File name too long: '/treatmentData_1712824020.879668'

After the traceback, the shell also printed this warning:

    (tkna) XinyuedeMacBook-Pro:input xinyuezhu$ /Users/xinyuezhu/opt/anaconda3/envs/tkna/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 2 leaked shared_memory objects to clean up at shutdown
      warnings.warn('resource_tracker: There appear to be %d '
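For what it's worth, the rejected name '/treatmentData_1712824020.879668' is 32 characters, and as far as I can tell macOS caps POSIX shared-memory names at roughly 31 characters (including the leading slash), whereas Linux allows much longer names. Here is a small standalone test I used to reproduce the limit outside of TkNA (my own script, not TkNA code):

```python
# Standalone repro (my own test, not TkNA code): macOS rejects POSIX shared
# memory names longer than ~31 characters, while shorter names are accepted.
from multiprocessing import shared_memory

long_name = "treatmentData_1712824020.879668"  # becomes "/treatmentData_..." (32 chars), too long on macOS
short_name = "tD_1712824020"                   # comfortably under the limit

try:
    shm = shared_memory.SharedMemory(create=True, size=1024, name=long_name)
    shm.close()
    shm.unlink()
    print("long name accepted")                # what I would expect on Linux
except OSError as err:
    print("long name rejected:", err)          # OSError: [Errno 63] File name too long on my Mac

shm = shared_memory.SharedMemory(create=True, size=1024, name=short_name)
print("short name accepted:", shm.name)
shm.close()
shm.unlink()
```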

It appears that the error is due to the generated shared-memory object name exceeding the maximum length allowed by macOS. Could you please advise me on how to address this issue? Any assistance would be greatly appreciated. Thank you!
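In case it helps, would shortening the generated name be a reasonable fix? Something along these lines (just a sketch; `short_shm_name` is my own hypothetical stand-in, not the actual `sharedMemoryName` helper in TkNA):

```python
# Hypothetical workaround sketch (not TkNA's actual code): hash the descriptive
# part of the name so the result stays well under macOS's ~31-character limit.
import hashlib
import time

def short_shm_name(label: str, runner_id) -> str:
    """Return a shared-memory name short enough for macOS."""
    raw = f"{label}_{runner_id}_{time.time()}"
    digest = hashlib.blake2b(raw.encode(), digest_size=8).hexdigest()  # 16 hex chars
    return f"shm_{digest}"                                             # 20 chars total

print(short_shm_name("treatmentData", 0))  # e.g. shm_9f2c4b1a7e3d5608
```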