Hi Raphael,
I am running the script for the manuscript again with the old version, and it reported an error on my new computer; it happens at almost the last step.
It is not urgent and can be bypassed in most cases, as I saved the shortest_path_timeserie for all datasets before. Maybe this won't be relevant anymore, as I didn't see such a function in your new manual. I am just reporting it to let you know!
The error happens at the "Convert igraph representation to lat-lon" step:
`> shortest_path_timeserie <- geopressure_ts_path(shortest_path_df, pam$pressure, include_flight = c(0, 1))`

**Error in checkNumberOfLocalWorkers(workers):** Attempting to set up 90 localhost parallel workers with only 22 CPU cores available for this R process (per ‘system’), which could result in a 409% load. The hard limit is set to 300%. Overusing the CPUs has negative impact on the current R process, but also on all other processes of yours and others running on the same machine. See `help("parallelly.options", package = "parallelly")` for how to override the soft and hard limits.
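In case it is useful, here is a sketch of how the limit could be bypassed, based on the `help("parallelly.options", package = "parallelly")` page that the error points to. The option name and the `future::plan()` call come from the parallelly/future documentation, not from the GeoPressureR manual, so treat it as an untested suggestion:

```r
## Possible workaround, run before calling geopressure_ts_path().

# Raise parallelly's soft/hard limits (expressed as multiples of the available
# cores) so that 90 workers on 22 cores (~409% load) is tolerated. Use with
# care: oversubscribing the CPUs is exactly what the check warns about.
options(parallelly.maxWorkers.localhost = c(1.0, 5.0))

# Alternatively, if geopressure_ts_path() respects an externally set future
# plan, capping the workers at the cores actually available avoids the error.
future::plan(future::multisession, workers = parallelly::availableCores())

shortest_path_timeserie <- geopressure_ts_path(
  shortest_path_df, pam$pressure,
  include_flight = c(0, 1)
)
```

If the old version sets up its own plan internally, only the `options()` line would have any effect.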