When using the SAVE XSPEC command in a fork-based grid run, the resulting FITS file will contain many blocks that are all zeros. This happens because results are not communicated back from the forked instances of the code. MPI-based and sequential runs are fine: MPI-based runs use MPI_Reduce() to combine the results, but there is no equivalent mechanism for fork-based runs.
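For illustration, here is a minimal sketch of the failure mode on a POSIX system (hypothetical code, not Cloudy source; the array name and sizes are made up). Each fork()ed child writes its result into its own copy-on-write copy of the array, so the parent's copy, which is what ends up in the FITS file, stays at zero.

/* fork_zeros.c -- illustrative only, not Cloudy source */
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

#define NPOINTS 4

int main(void)
{
    double results[NPOINTS] = { 0. };   /* same name in parent and children, but NOT shared memory */

    for (int i = 0; i < NPOINTS; ++i)
    {
        pid_t pid = fork();
        if (pid == 0)
        {
            /* child: this write lands in the child's private copy of results[];
             * the parent never sees it */
            results[i] = 42. + i;
            _exit(0);
        }
    }
    while (wait(NULL) > 0)
        ;   /* reap all children */

    /* parent: every element is still zero -- the blocks of zeros
     * seen in the saved FITS file */
    for (int i = 0; i < NPOINTS; ++i)
        printf("results[%d] = %g\n", i, results[i]);
    return 0;
}

By contrast, a sketch of the MPI_Reduce() pattern the ticket refers to (again illustrative, under the assumption that each rank computes a disjoint share of the grid): each rank fills its own slots and the element-wise sum is collected on rank 0, which can then write a complete file.

/* mpi_reduce_sketch.c -- illustrative only */
#include <mpi.h>
#include <stdio.h>

#define NPOINTS 4

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double local[NPOINTS] = { 0. }, total[NPOINTS] = { 0. };
    for (int i = rank; i < NPOINTS; i += size)
        local[i] = 42. + i;   /* this rank's share of the grid points */

    /* element-wise sum onto rank 0; slots other ranks left at zero
     * contribute nothing, so every grid point survives */
    MPI_Reduce(local, total, NPOINTS, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        for (int i = 0; i < NPOINTS; ++i)
            printf("total[%d] = %g\n", i, total[i]);

    MPI_Finalize();
    return 0;
}

A fork-based fix therefore needs some explicit channel back to the parent, e.g. a pipe, a per-child temporary file, or mmap()ed shared memory, since a forked child cannot update the parent's address space directly.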
{
"status": "closed",
"changetime": "2019-02-04T12:09:37Z",
"_ts": "1549282177514312",
"description": "When using the SAVE XSPEC command in a fork-based grid run, the resulting FITS file will contain many blocks of only zeros. This is because the results are not communicated between the various forked instances of the code. MPI-based and sequential runs are fine. MPI-based runs use MPI_Reduce() to communicate the results. There is no equivalent for fork-based runs.",
"reporter": "peter",
"cc": "",
"resolution": "fixed",
"time": "2017-08-25T07:48:06Z",
"component": "output",
"summary": "Output from SAVE XSPEC command is faulty in fork-based runs",
"priority": "major",
"keywords": "",
"version": "trunk",
"milestone": "c17.01",
"owner": "peter",
"type": "defect - wrong answer"
}
reported by: peter
Migrated from https://www.nublado.org/ticket/399