quinngroup / dr1dl-pyspark

Dictionary Learning in PySpark
Apache License 2.0

Complexity analysis #55

Closed magsol closed 8 years ago

magsol commented 8 years ago

We need a more thorough complexity analysis of our algorithm, building on the back-of-the-envelope calculations we did earlier in my office. In particular, we need analysis of

XiangLi-Shaun commented 8 years ago

Shannon, do you happen to know an easy way in PySpark to get the memory load (average and/or max) on each worker? For example, to analyze the time cost I can simply print datetime.datetime.now(); is there a similar way to measure memory?

Otherwise I'll look into the log files from PySpark, which are messy since they record information from every function call. Thanks.
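One way to sample this from inside the job itself (a minimal sketch, not from this repo; the partition logic, app name, and slice count are placeholders) is to wrap the per-partition work in mapPartitions and report resource.getrusage() alongside the result. Note that ru_maxrss reflects the Python worker process, not the JVM executor, and is in kilobytes on Linux (bytes on macOS):

```python
import datetime
import resource

from pyspark import SparkContext


def timed_partition(iterator):
    """Run the per-partition work and yield (elapsed seconds, peak RSS, element count)."""
    start = datetime.datetime.now()
    data = list(iterator)  # the real per-partition computation would go here
    elapsed = (datetime.datetime.now() - start).total_seconds()
    peak_rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    yield (elapsed, peak_rss, len(data))


if __name__ == "__main__":
    sc = SparkContext(appName="memory-probe")
    rdd = sc.parallelize(range(1000000), numSlices=8)
    for elapsed, peak_rss, n in rdd.mapPartitions(timed_partition).collect():
        print("partition: %d elements, %.3f s, peak RSS %d" % (n, elapsed, peak_rss))
    sc.stop()
```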

magsol commented 8 years ago

It should be available via the master node's job tracker, at http://<master>:8080

iPhone'd
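For reference, Spark 1.4+ also exposes executor memory metrics through its monitoring REST API on the application UI (typically the driver host on port 4040), which can be easier to scrape than the master page on :8080. A hedged sketch, assuming Python 2 of that era; the host and port are placeholders for this cluster, and memoryUsed/maxMemory refer to storage memory for cached RDDs, in bytes:

```python
import json
import urllib2

# Placeholder for the driver host running the application UI.
UI = "http://driver-host:4040/api/v1"

# Look up the running application, then pull per-executor memory figures.
apps = json.load(urllib2.urlopen(UI + "/applications"))
app_id = apps[0]["id"]

executors = json.load(urllib2.urlopen("%s/applications/%s/executors" % (UI, app_id)))
for e in executors:
    print("%s: %d / %d bytes" % (e["id"], e["memoryUsed"], e["maxMemory"]))
```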
