worldbank / ietoolkit

Stata commands designed for Impact Evaluations in particular, but also data work in general
https://worldbank.github.io/ietoolkit/
MIT License

Stata 17 crashes when using `iebaltab` with 4.3GB dataset #368

Open paulasanematsu opened 2 weeks ago

paulasanematsu commented 2 weeks ago

Hello,

I am a Research Computing Facilitator at FASRC. Raul Duarte reached out to our support because a Stata do-file he was running on our cluster with the `iebaltab` command kept dying midway through computation. We troubleshot extensively without much progress, so we are reaching out to you for guidance. I will try to summarize the computational environment and what we have done so far.

Unfortunately, Raul's data cannot be shared because of a signed Data Use Agreement (DUA), but we will try to explain as much as possible.

Computational environment

Analysis

Raul wrote a do-file that uses the `iebaltab` command to analyze a 4.3 GB dataset:

iebaltab median_hs6_unit_price median_hs6_cifdoldecla median_hs6_imponiblegs unit_price_final cifdoldecla imponiblegs, replace grpvar(val_count_patronage_hire) fixedeffect(port_day_ID) ///
    savetex("$DirOutFasse\baltab_val_shipment_item_values_counter_day.tex") ///
    grplabels(0 "Non-patronage" @ 1 "Patronage")  format(%12.0fc) order(1 0) ///
    rowlabels(median_hs6_unit_price "Median HS6 unit price (in USD)" @ median_hs6_cifdoldecla "Median HS6 CIF value (in USD)" ///
        @ median_hs6_imponiblegs "Median HS6 tax base (in PYG)" @ unit_price_final "Unit price (in USD)" ///
        @ cifdoldecla "Declared CIF value (in USD)" @ imponiblegs "Tax base (in PYG)") nonote

Raul wrote:

This line uses the following command to create a balance table. My dataset is a database of imports, and for the balance table tests of differences between two groups (patronage and non-patronage) handling shipment items, I want to include port-day fixed effects. Since I have 5 years of data and 31 customs ports, this could lead to more than 56,000 fixed effects, which seems to be what is causing the problem, as the balance table does run without the fixed effects.

His typical run was on fasse_bigmem (499 GB of RAM and 64 cores).

Troubleshooting steps

  1. On the Stata GUI, Raul tried the following:
    1. To rule out out-of-memory errors, he tested the do-file on our computer with 2000 GB of RAM and 64 cores and still ran into the same problem.
    2. Successfully ran the Do-file with iebaltab on a subset of his original dataset. The subset is a 5% random sample of the original dataset.
    3. Checked that he is not exceeding any of Stata's settings.
    4. Set max_memory to slightly less than the total memory: 495 GB when the memory requested on fasse_bigmem was 499 GB.
    5. Tried to run with Stata/SE using a single core, but Stata threw an error that the SE version cannot handle that many variables.
    6. I suggested using debugging mode (https://www.stata.com/support/faqs/programming/debugging-program/), but that has not provided more useful information about the error.
  2. On the command line, I submitted a job via the scheduler to run the same Do-file using the original dataset
    1. While the job was running, I used top to monitor CPU and memory usage, and I also kept checking the disk usage of /tmp with the du command. Core usage was almost at 100% for all 64 cores, memory was at about 5-6% (of 499 GB), and /tmp held about 4-5 GB. After about 1h, I could see each process dying and everything stalled.
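For reference, the memory cap in step 1.4 and the subset test in step 1.2 can be sketched in Stata as follows. The file names and seed here are placeholders, not Raul's actual paths or settings:

```stata
* Cap Stata's memory slightly below the job allocation (step 1.4).
set max_memory 495g

* Draw a reproducible 5% random sample of the original data (step 1.2).
* File names below are hypothetical placeholders.
use "original_data.dta", clear
set seed 12345          // fix the seed so the subset is reproducible
sample 5                // keep a 5% random sample of observations
save "subset_5pct.dta", replace
```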

I am hoping that you have some guidance on whether Raul possibly ran into a bug or whether there is something on our end that we need to change.

Thank you for taking the time to read this. We will be happy to answer any questions.

Best, Paula and Raul

kbjarkefur commented 4 days ago

Wow, you are really putting our code to the test. Fun!

Here are my first reactions to what you have already tested:

Questions:

Suggestions:

Let me know what these comments make you think of, or what these suggestions teach you. Happy to keep working with you until this is resolved. However, the issue might also lie in Stata itself (especially on Linux), in which case I would not be able to help with a solution.

paulasanematsu commented 3 hours ago

Glad to hear this is a "fun" problem!

Answering your questions:

Based on Kristoffer's thoughts and suggestions, I have a few suggestions for Raul so we can better understand what is happening:

  1. For all runs below, set max_memory a little lower than the requested memory (e.g. if you request 200 GB, set max_memory to 190 GB). Although we don't have indications of a memory issue, I think this is safer than no setting at all.

  2. Rerun a do-file that uses a 5% random sample. I would like to test the 1h-timeout hypothesis. You ran this before, but I would like to confirm that it ran beyond the 1h limit. If Stata allows, can you print the date and time before and after the iebaltab call so we know how long that particular command ran? If the 5% sample runs in less than 1h, then increase the sample size.

  3. Run the original do-file using the fasse_gpu partition (i.e., a GPU-enabled machine). To use it, when you request a Stata session, request the fasse_gpu partition and set "Number of GPUs" to 1. I am not sure if Stata needs extra settings to run on a GPU or if it works out of the box. You can check whether the GPU card is being used by opening a terminal (Applications in the top left corner -> Terminal Emulator) and executing the command nvtop. If the GPU is in use, you will see a graph with GPU % and GPU mem % usage.
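For the timing check in suggestion #2, the timestamps could be printed with Stata's `c(current_date)` and `c(current_time)` system values around the call, along these lines (the full `iebaltab` call from the do-file is elided here):

```stata
display "iebaltab started:  " c(current_date) " " c(current_time)

* ... the full iebaltab call from the do-file goes here ...

display "iebaltab finished: " c(current_date) " " c(current_time)
```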

Raul, if you prefer to prepare a do-file for #2, I will be happy to run and observe CPU and memory usage while it runs.

Does Stata have a built-in profiler that shows how much time and memory each function uses? If so, it would be worth using it in these additional tests.
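If it helps, one partial approach (a sketch; I am not a Stata expert) uses Stata's built-in `timer` command for elapsed time and the `memory` command for a memory-usage report:

```stata
timer clear
timer on 1

* ... run the iebaltab call here ...

timer off 1
timer list 1     // elapsed wall-clock seconds for timer 1
memory           // report Stata's current memory usage
```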