Closed pcdjohnson closed 8 years ago
Hi there, sorry for the late reply. This was due to long time intervals generated by the initial random tree. Fixed as of 83b4d2f7d4abe420eef836d9a6ce43a89c1ee030. The behaviour of the function is now a bit more clever: the generation time distribution is automatically completed, if needed, by an exponential tail summing to a low value (1e-4) so that it covers the full temporal range. As a result, even initial star trees should no longer return a -Inf temporal log-likelihood. Your example now runs for the first few hundred iterations. I'll wait for your confirmation before closing the issue.
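To illustrate the idea behind the fix, here is a minimal sketch (in Python, not outbreaker's actual C code in structures.c) of completing a discretized generation-time distribution with an exponential tail. The function name, the exact tail shape, and the `rate` parameter are assumptions for illustration only; the key point matches the comment above: the added tail sums to a small mass (1e-4) and gives every time unit in range a strictly positive density, so long intervals no longer yield a -Inf log-likelihood.

```python
import math

def complete_gentime(w, t_max, tail_mass=1e-4, rate=1.0):
    """Extend a discretized generation-time distribution `w`
    (density for days 1..len(w)) out to `t_max` days by appending
    an exponentially decaying tail whose total mass is `tail_mass`,
    then renormalising so the whole vector sums to 1.

    Hypothetical sketch -- not outbreaker's actual implementation.
    """
    n_extra = t_max - len(w)
    if n_extra <= 0:
        return list(w)  # already covers the temporal range
    # Unnormalised exponential weights for the extra days.
    raw = [math.exp(-rate * k) for k in range(1, n_extra + 1)]
    s = sum(raw)
    # Scale so the appended tail sums exactly to tail_mass.
    tail = [tail_mass * x / s for x in raw]
    out = list(w) + tail
    total = sum(out)
    return [x / total for x in out]

# Example: a 3-day distribution extended to cover 10 days.
dens = complete_gentime([0.2, 0.5, 0.3], t_max=10)
# Every day up to t_max now has positive density, so
# log(dens[t]) is finite for any t in range.
```

Because the tail mass is tiny (1e-4 before renormalisation), the original distribution is essentially unchanged for plausible generation times, while implausibly long intervals are merely heavily penalised instead of being assigned zero probability.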
Great, thanks very much -- that seems to have fixed the problem. I haven't had the patience to let the simple simulated example above complete over 1e5 iterations, but a short run using find.import.n = 5, sample.every = 50, n.iter = 1e4, burnin = 5e3
did work. I also ran it on the real data (~2000 real cases with complete simulated sequence data + spatial data) that generated the error in the first place. It worked very nicely.
Hi,
While running outbreaker on a large data set of 2026 cases with complete sequence data, I got this type of error:
Error in outbreaker( ... [in: structures.c->gentime_dens] Trying to get density for 150 time units (max: 150). Exiting.
I reproduced this error in a simple analysis of 2000 simulated dates with no sequence data. The console output from the full session is pasted below, followed by the sessionInfo() output.
Thanks for any help, Paul
PS this issue was originally posted here: https://groups.google.com/forum/?hl=en#!topic/r-epi/l4lOnp-eUKk