BaselAbujamous / clust

Automatic and optimised consensus clustering of one or more heterogeneous datasets

Issue with --no-fil-flat #80

Open lucasinchausti opened 1 year ago

lucasinchausti commented 1 year ago

Hello,

I'm trying clust, but it seems I can't get the --no-fil-flat option to avoid gene filtering during the pre-processing steps.

This is my command line: $ clust input_file.txt --no-fil-flat -n 0 -t 2 -cs 50 -o clust_7

Note: my input file is already normalized and consists of 10338 genes

And this is the output:

/===========================================================================\
 Clust
 (Optimised consensus clustering of multiple heterogenous datasets)
 Python package version 1.17.0 (2022)
 Basel Abu-Jamous
+---------------------------------------------------------------------------+
 Analysis started at: Thursday 27 April 2023 (19:09:25)

 1. Reading dataset(s)
 2. Data pre-processing
 3. Seed clusters production (the Bi-CoPaM method)
 10%  20%  30%  40%  50%  60%  70%  80%  90%  100%
 4. Cluster evaluation and selection (the M-N scatter plots technique)
 10%  20%  30%  40%  50%  60%  70%  80%  90%  100%
 5. Cluster optimisation and completion
 6. Saving results in clust_7
+---------------------------------------------------------------------------+
 Analysis finished at: Thursday 27 April 2023 (19:14:08)
 Total time consumed: 0 hours, 4 minutes, and 42 seconds
\===========================================================================/

/===========================================================================\
|                              RESULTS SUMMARY                              |
+---------------------------------------------------------------------------+
| Clust received 1 dataset with 10338 unique genes. After filtering, 10338  |
| genes made it to the clustering step. Clust generated 3 clusters of       |
| genes, which in total include 3463 genes. The smallest cluster includes   |
| 357 genes, the largest cluster includes 2575 genes, and the average       |
| cluster size is 1154 genes.                                               |
+---------------------------------------------------------------------------+

Thanks!

Lucas