Open pilarOrtega opened 4 years ago
Maybe this issue is related to a deeper problem in the wrapped hmetis.
Reference: http://glaros.dtc.umn.edu/gkhome/node/555
Hi,
Any update on this?
I'm having the same error:
```
INFO: Cluster_Ensembles: cluster_ensembles: due to a rather large number of cells in your data-set, using only 'HyperGraph Partitioning Algorithm' (HGPA) and 'Meta-CLustering Algorithm' (MCLA) as ensemble consensus functions.
INFO: Cluster_Ensembles: HGPA: consensus clustering using HGPA.
INFO: Cluster_Ensembles: wgraph: writing wgraph_HGPA.
INFO: Cluster_Ensembles: wgraph: 64612 vertices and 64 non-zero hyper-edges.
INFO: Cluster_Ensembles: sgraph: calling shmetis for hypergraph partitioning.
Out of netind memory!
Traceback (most recent call last):
  File "consensus_clustering.py", line 105, in
```
Hi!
I am having this issue when trying to run Cluster_Ensembles on a CentOS machine. I have already installed METIS, and it apparently runs fine on its own.
Do you know what could be causing the error?
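For context, here is a minimal sketch of the kind of call that triggers this. The `CE.cluster_ensembles(cluster_runs)` usage is an assumption based on the package's documented interface, and the array sizes here are placeholders (the log above mentions 64612 samples), not my actual data:

```python
import numpy as np

# Hypothetical reproduction sketch; the Cluster_Ensembles call below is an
# assumption based on the package's documented usage, not my exact script.
n_samples = 1000      # placeholder; the real run had ~64612 samples
n_partitions = 3      # number of base clusterings, stacked row-wise

# Each row of cluster_runs is one base clustering: a cluster label per sample.
rng = np.random.default_rng(0)
cluster_runs = rng.integers(0, 8, size=(n_partitions, n_samples))

try:
    import Cluster_Ensembles as CE  # requires the Cluster_Ensembles package
    # This is the step that internally writes the hypergraph and shells out
    # to shmetis, where "Out of netind memory!" is raised.
    consensus = CE.cluster_ensembles(cluster_runs)
except ImportError:
    consensus = None  # package not installed; input format shown above anyway

print(cluster_runs.shape)  # (3, 1000)
```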