loosolab / TF-COMB

Transcription Factor Co-Occurrence using Market Basket analysis
https://tf-comb.readthedocs.io
MIT License

Error at market_basket step #77

Open ancient-learner opened 9 months ago

ancient-learner commented 9 months ago

Hi, I am getting the following error while running the market_basket function.


>>> C.TFBS_from_motifs(regions="./Peaks_hg19.bed",
...     motifs="./HOCOMOCOv11_HUMAN_motifs.txt", genome="./hg19.fa.gz", threads=12)
INFO: Scanning for TFBS with 12 thread(s)...
INFO: Progress: 10%
INFO: Progress: 20%
INFO: Progress: 30%
INFO: Progress: 40%
INFO: Progress: 51%
INFO: Progress: 60%
INFO: Progress: 70%
INFO: Progress: 81%
INFO: Progress: 95%
INFO: Finished!
INFO: Processing scanned TFBS
INFO: Identified 1181278 TFBS (401 unique names) within given regions
INFO: The attribute .TFBS now contains 1306796 TFBS (401 unique names)
>>> C.market_basket(threads=10)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ay328/.conda/envs/tfcomb_env/lib/python3.10/site-packages/tfcomb/objects.py", line 1129, in market_basket
    table = pd.melt(pair_counts_table, id_vars=["TF1"], var_name=["TF2"], value_name="TF1_TF2_count")  #long format (TF1, TF2, value)
  File "/home/ay328/.conda/envs/tfcomb_env/lib/python3.10/site-packages/pandas/core/reshape/melt.py", line 100, in melt
    raise ValueError(f"{var_name=} must be a scalar.")
ValueError: var_name=['TF2'] must be a scalar.

Please let me know if I am doing something wrong.
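For context, recent pandas releases require the `var_name` argument of `pd.melt` to be a scalar, which is exactly what the `ValueError` in the traceback reports. A minimal sketch of the failing pattern and the working form (the DataFrame and TF names below are made up for illustration; only the call shape follows the `pd.melt` line in the traceback):

```python
import pandas as pd

# Toy pair-count table mirroring the shape used before melting:
# one row per TF1, one column per TF2 partner (names are illustrative).
pair_counts_table = pd.DataFrame(
    {"TF1": ["CTCF", "GATA1"], "JUN": [10, 3], "FOS": [5, 7]}
)

# Newer pandas rejects a list for var_name ("var_name=['TF2'] must be a scalar");
# passing the plain string works across versions.
table = pd.melt(
    pair_counts_table,
    id_vars=["TF1"],
    var_name="TF2",  # scalar, not ["TF2"]
    value_name="TF1_TF2_count",
)
print(table.columns.tolist())  # ['TF1', 'TF2', 'TF1_TF2_count']
```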

hschult commented 8 months ago

Hi! Thanks for reporting this error. Investigation showed that this is a bug related to the installed pandas version. We are currently working on a fix; until it is released, please downgrade your pandas version to 2.0.0 as a workaround.
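One way to apply the downgrade is with pip inside the conda environment shown in the traceback (using conda's own pinning would work as well; the exact commands are a sketch, not from the maintainers):

```shell
# Pin pandas to 2.0.0 inside the active tfcomb environment
pip install "pandas==2.0.0"

# Confirm the installed version before re-running market_basket
python -c "import pandas; print(pandas.__version__)"
```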

ancient-learner commented 8 months ago

> Hi! thanks for reporting this error. Investigation showed that this is a bug connected to the installed pandas version. We are currently working on a fix. Until this is done please downgrade your pandas version to 2.0.0 as a workaround.

Thanks very much! It worked after downgrading the pandas version :)