Closed: clarkfitzg closed this issue 7 years ago
On Mon, Apr 03, 2017 at 04:34:57PM -0700, Clark Fitzgerald wrote:
I just gave bigmemory a shot, thanks for the suggestion. Performance is far better than swap. I managed to use all remaining disk space on my machine by creating 350 GB of temporary files over the course of the experiment.
Glad to hear this.
Today I learned the files in /tmp don't get automatically removed until a reboot :)
Yes, it's a real problem for our students in CS, as the undergraduate computer lab machines are automatically rebooted every night.
Re NFS, that seems like a complex topic. What you described sounds similar to this: http://stackoverflow.com/a/10369193/2681019
Yes, this is very relevant to my problem.
Do you mean that partools could be using bigmemory?
Yes, as a supplement. Distributing a data frame would be much faster using shared memory, if the data does not need to be sorted.
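To make the shared-memory idea concrete, here is a minimal sketch of how bigmemory's file-backed matrices could be used for this. The file names and sizes are placeholders, and note that `big.matrix` holds a single-type matrix, so a data frame would need conversion first:

```r
library(bigmemory)

# A file-backed big.matrix keeps the data on disk (the backing file,
# e.g. under /tmp) and memory-maps it, so other R processes can attach
# to it through shared memory instead of receiving a serialized copy.
x <- filebacked.big.matrix(nrow = 1e5, ncol = 10, type = "double",
                           backingfile = "x.bin",
                           descriptorfile = "x.desc",
                           backingpath = tempdir())
x[, 1] <- rnorm(1e5)

# A worker process only needs the small descriptor file to attach;
# no data is copied over a socket.
y <- attach.big.matrix(file.path(tempdir(), "x.desc"))
```

This is also why the experiment above filled the disk: each backing file persists in the backing path until it is deleted or the machine reboots.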
@matloff you'll need to login to allow Travis to run on the main repository. Then we can link to test results like these: https://travis-ci.org/clarkfitzg/partools
I merged it, and (I think) set permission to run Travis. Please check.
Norm
Beginning to set up the test infrastructure.
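For reference, a minimal `.travis.yml` for an R package might look like the sketch below (assumed defaults; the actual config in the repository may differ):

```yaml
# Minimal Travis CI config for an R package: builds the package and
# runs R CMD check on each push.
language: r
cache: packages
warnings_are_errors: true
```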