GGiecold-zz / Cluster_Ensembles

A package for combining multiple partitions into a consolidated clustering. The combinatorial optimization problem of obtaining such a consensus clustering is reformulated in terms of approximation algorithms for graph or hyper-graph partitioning.
MIT License

FileNotFoundError: [Errno 2] No such file or directory: 'wgraph_HGPA.part.16' #19

Open pilarOrtega opened 4 years ago

pilarOrtega commented 4 years ago

Hi!

I am having this issue when trying to run Cluster_Ensembles on a CentOS machine. I have already installed METIS and it is apparently running.

Do you know what could be causing the error?

INFO: Cluster_Ensembles: cluster_ensembles: due to a rather large number of cells in your data-set, using only 'HyperGraph Partitioning Algorithm' (HGPA) and 'Meta-CLustering Algorithm' (MCLA) as ensemble consensus functions.

*****
INFO: Cluster_Ensembles: HGPA: consensus clustering using HGPA.

#
INFO: Cluster_Ensembles: wgraph: writing wgraph_HGPA.
INFO: Cluster_Ensembles: wgraph: 239847 vertices and 119 non-zero hyper-edges.
#

#
INFO: Cluster_Ensembles: sgraph: calling shmetis for hypergraph partitioning.
Out of netind memory!
Traceback (most recent call last):
  File "cluster_ensemble.py", line 42, in <module>
    clusterlist = cooperative_cluster(data, feature_method)
  File "cluster_ensemble.py", line 22, in cooperative_cluster
    consensus_labels = CE.cluster_ensembles(cluster_runs, verbose = True, N_clusters_max = 16)
  File "/home/DeepLearning/Pyenv/ontoenv/lib/python3.6/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 309, in cluster_ensembles
    cluster_ensemble.append(consensus_functions[i](hdf5_file_name, cluster_runs, verbose, N_clusters_max))
  File "/home/DeepLearning/Pyenv/ontoenv/lib/python3.6/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 657, in HGPA
    return hmetis(hdf5_file_name, N_clusters_max)
  File "/home/DeepLearning/Pyenv/ontoenv/lib/python3.6/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 982, in hmetis
    labels = sgraph(N_clusters_max, file_name)
  File "/home/DeepLearning/Pyenv/ontoenv/lib/python3.6/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 1210, in sgraph
    with open(out_name, 'r') as file:
FileNotFoundError: [Errno 2] No such file or directory: 'wgraph_HGPA.part.16'
ghost commented 3 years ago

Maybe this issue is related to a deeper problem in the wrapped hmetis.

Reference: http://glaros.dtc.umn.edu/gkhome/node/555
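One way to confirm whether the failure lies in the hMETIS binary rather than the Python wrapper is to re-run the partitioning step by hand on the intermediate hypergraph file. A sketch, assuming `wgraph_HGPA` is still in the working directory and `shmetis` is on your PATH; `16` matches `N_clusters_max` from the report above, and the UBfactor of 5 is just a typical value from the hMETIS manual, not necessarily what Cluster_Ensembles passes:

```shell
# Reproduce the failing step outside Python.
# shmetis arguments: <HGraphFile> <Nparts> <UBfactor>
shmetis ./wgraph_HGPA 16 5

# On success this writes ./wgraph_HGPA.part.16; if it prints
# "Out of netind memory!" instead, the bug is in hMETIS itself,
# not in the Cluster_Ensembles wrapper.
ls -l wgraph_HGPA.part.16
```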

anacoelho commented 3 years ago

Hi,

Any update on this?

I'm having the same error:

INFO: Cluster_Ensembles: cluster_ensembles: due to a rather large number of cells in your data-set, using only 'HyperGraph Partitioning Algorithm' (HGPA) and 'Meta-CLustering Algorithm' (MCLA) as ensemble consensus functions.

*****
INFO: Cluster_Ensembles: HGPA: consensus clustering using HGPA.

#
INFO: Cluster_Ensembles: wgraph: writing wgraph_HGPA.
INFO: Cluster_Ensembles: wgraph: 64612 vertices and 64 non-zero hyper-edges.
#

#
INFO: Cluster_Ensembles: sgraph: calling shmetis for hypergraph partitioning.
Out of netind memory!
Traceback (most recent call last):
  File "consensus_clustering.py", line 105, in <module>
    roi_labels = ensemble_clustering(working_dir, subjects_filepath, metric, id_roi, k, atlas_name)
  File "consensus_clustering.py", line 88, in ensemble_clustering
    ensemble_labels = CE.cluster_ensembles(cluster_mat, verbose=True, N_clusters_max=nr_cl)
  File "/home/neuroimaging/.local/lib/python3.8/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 309, in cluster_ensembles
    cluster_ensemble.append(consensus_functions[i](hdf5_file_name, cluster_runs, verbose, N_clusters_max))
  File "/home/neuroimaging/.local/lib/python3.8/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 657, in HGPA
    return hmetis(hdf5_file_name, N_clusters_max)
  File "/home/neuroimaging/.local/lib/python3.8/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 982, in hmetis
    labels = sgraph(N_clusters_max, file_name)
  File "/home/neuroimaging/.local/lib/python3.8/site-packages/Cluster_Ensembles/Cluster_Ensembles.py", line 1210, in sgraph
    with open(out_name, 'r') as file:
FileNotFoundError: [Errno 2] No such file or directory: 'wgraph_HGPA.part.2'