chenhcs / scResolve

MIT License

High Memory Usage Leading to System Crash During `convert visium` Command #1

Open jkbenotmane opened 11 months ago

jkbenotmane commented 11 months ago

Environment:

```python
import json
import os

for sample, path in samples.items():
    print(f"#### {sample} ####")
    print(f"##  Converting {sample} ##")
    outpath = f"{out}/{sample}"
    os.makedirs(outpath, exist_ok=True)

    # Read the spot diameter from the Space Ranger scale-factors file
    file_path = f"{path}/spatial/scalefactors_json.json"
    try:
        with open(file_path, 'r') as file:
            data = json.load(file)
        print(data)
    except FileNotFoundError:
        print(f"File not found: {file_path}. Please check the file path.")
        continue  # skip this sample; `data` is undefined past this point
    except json.JSONDecodeError:
        print(f"File is not valid JSON: {file_path}")
        continue
    # spot_diameter_fullres is a float; int() truncates it to whole pixels
    scale_factor = int(data["spot_diameter_fullres"])

    # Prepare input for scResolve (Jupyter shell escape)
    !scresolve convert visium \
    --bc-matrix "{path}/filtered_feature_bc_matrix.h5" \
    --image "{path}/spatial/tissue_lowres_image.png" \
    --tissue-positions "{path}/spatial/tissue_positions_list.csv" \
    --scale-factors "{path}/spatial/scalefactors_json.json" \
    --save-path "{outpath}" \
    --mask-file "{path}/spatial/detected_tissue_image.jpg" \
    --scale {scale_factor}
```

Issue: The command runs, but the script eventually consumes all available RAM, leading to a system crash. This happens with both the lowres and hires Visium images.
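As a stopgap against the full system crash, a minimal sketch of capping the process's address space so an out-of-memory run fails with `MemoryError` instead of freezing the machine (assumes Linux; the 32 GB cap is an arbitrary example value, not a scResolve recommendation):

```python
# Cap this process's address space so an out-of-memory run raises
# MemoryError (or is killed) instead of taking the whole system down.
# Linux only; the limit below is a hypothetical example value.
import resource

limit_bytes = 32 * 1024**3  # adjust to leave headroom below physical RAM
resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))
```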

Troubleshooting Steps Taken:

Questions/Requests:

  1. Are there any known memory optimization techniques specific to scresolve that I can apply?
  2. Is there a way to reduce the memory footprint of the `convert visium` command? (A possible workaround is sketched after this list.)
  3. How should the `--scale` parameter be used, and how can I optimize it for memory usage?
  4. Any suggestions would be greatly appreciated.
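Regarding question 2, one workaround sketch, under stated assumptions: downscale the input image before conversion so less pixel data is held in memory. Whether scresolve accepts a pre-downscaled image together with a proportionally reduced `--scale` value is an assumption, not confirmed behavior; the variables (`path`, `out`, `sample`, `scale_factor`) come from the script above.

```python
# Hedged workaround: shrink the image before `scresolve convert visium`.
# Assumption: scresolve tolerates a pre-downscaled image if --scale is
# reduced by the same factor; this is not confirmed behavior.
from PIL import Image

Image.MAX_IMAGE_PIXELS = None  # lift PIL's decompression-bomb guard for large Visium images

factor = 0.5  # hypothetical downscaling factor
img = Image.open(f"{path}/spatial/tissue_lowres_image.png")
img = img.resize((int(img.width * factor), int(img.height * factor)),
                 Image.Resampling.LANCZOS)
img.save(f"{out}/{sample}/tissue_downscaled.png")

# The spot diameter would shrink by the same factor (assumption):
# scaled_spot_diameter = scale_factor * factor
```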

Thank you for your assistance!

DorisZhan commented 2 days ago

I had the same issue! Did you solve it?