Closed: ha-nguyen closed this issue 8 years ago.
Can you share the code you used and what indication you have that it failed? In case it matters, the following works for me as the `backend.parameters` argument: `backend.parameters = list(hadoop = list(D = "mapred.map.tasks=4", D = "mapred.reduce.tasks=4"))`.
According to the log files, these parameters are deprecated and should be changed to `mapreduce.job.maps` and `mapreduce.job.reduces`, respectively, but for now they work as they should (with warnings).
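Putting the two comments above together, a full call might look like the sketch below; the input path and the count of 4 are illustrative, not from the original thread:

```r
library(rmr2)

# Pass the task counts as generic Hadoop -D options via backend.parameters.
# The "mapred.*" names still work but log deprecation warnings; the current
# names are mapreduce.job.maps and mapreduce.job.reduces.
out <- mapreduce(
  input = "/tmp/some-input",          # hypothetical HDFS path
  map   = function(k, v) keyval(k, v),
  backend.parameters = list(
    hadoop = list(
      D = "mapreduce.job.maps=4",
      D = "mapreduce.job.reduces=4")))
```

Note that `mapreduce.job.maps` is only a hint to Hadoop: the actual number of map tasks is ultimately determined by the number of input splits, whereas `mapreduce.job.reduces` is honored directly.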
On Mon, Feb 22, 2016 at 5:29 PM, ha-nguyen notifications@github.com wrote:
I am trying to change the number of map/reduce tasks from the R environment, but none of these parameters worked: "mapred.map.tasks", "mapred.map.task.maximum", "mapred.tasktracker.map.tasks.maximum", "mapred.tasktracker.reduce.tasks.maximum".
I also looked for help in the RHadoop settings, but they specify only four parameters: mapreduce.map.java.opts, mapreduce.reduce.java.opts, mapreduce.map.memory.mb, mapreduce.reduce.memory.mb.
Is there any way to change the number of mappers/reducers from the R environment without modifying the Hadoop configuration files?
Thanks, Ha.
— Reply to this email directly or view it on GitHub https://github.com/RevolutionAnalytics/rmr2/issues/176.
Thanks a lot! It works for me now!