lbartnik / subprocess


process_read hung #57

Open DSXiangLi opened 6 years ago

DSXiangLi commented 6 years ago

Hi Lukasz,

Thanks for this amazing package! It has been of great help!

I recently ran into a problem when calling process_read, see below

handle <- try(spawn_process(R_command, arguments = argument), silent = TRUE)

message <- process_read(handle, pipe = PIPE_STDERR, timeout = 1000, flush = TRUE)

if (length(message) != 0) {
  cat(message, '\n', file = log_con)
}
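
For context, the surrounding loop looks roughly like this. This is only a sketch of my pattern, not the exact script: `R_command`, `argument`, and `log_con` are defined elsewhere, and I am assuming `process_state()` reports "running" while the child is alive.

```
library(subprocess)

# Sketch of the polling loop; R_command, argument and log_con
# are placeholders defined elsewhere in the real script.
handle <- try(spawn_process(R_command, arguments = argument), silent = TRUE)

while (process_state(handle) == "running") {
  # read whatever the child has written to stderr so far
  message <- process_read(handle, pipe = PIPE_STDERR, timeout = 1000, flush = TRUE)
  if (length(message) != 0) {
    cat(message, sep = '\n', file = log_con)
  }
}

process_wait(handle)  # collect the child's exit status
```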

Basically, I am calling another Rscript that contains a while loop which reads from a database and prints all the queries. I want to log all of these queries into my log file, log_con.

However, whenever I reach about 450 lines of queries, process_read stops returning results. When I check process_state(handle), it shows the script is still running.

The Rscript works fine when run independently. Any idea what may be causing this?

Also, is there an easier way to write the process_read output into a log file, similar to the Python call

subprocess.check_call(command, stdout=log, stderr=log, shell=True, env=rscript_env)

Many thanks!

Sandy

lbartnik commented 6 years ago

Hi Sandy,

I'd love to see the command that produces the output you want to capture. Or some characteristics of it: e.g. how long are the lines? How much data is sent to the output? What is the encoding? In order to answer your questions, I'd have to reproduce the error first.

By the way, you might want to try the processx package (https://github.com/r-lib/processx). It seems like a good choice for what you're trying to achieve.
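
For the log-file question specifically, processx can redirect the child's output straight to files, much like the Python check_call example. A minimal sketch, assuming the child script and log file names (`subprocess.R`, `child.log`) as placeholders:

```
library(processx)

# Redirect the child's stdout and stderr directly to a log file,
# similar to subprocess.check_call(..., stdout=log, stderr=log).
result <- run(
  "Rscript",
  args   = c("subprocess.R"),  # placeholder for your child script
  stdout = "child.log",        # file path: output is written here
  stderr = "2>&1"              # merge stderr into stdout
)
```

This avoids the manual read loop entirely, since processx handles the pipe draining for you.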

ws171913 commented 6 years ago

Tested on Windows (R 3.4) and Ubuntu (R 3.4) with the included code. subprocess.R writes around 949 bytes to stderr every 0.1 s. master.R reads PIPE_STDERR as in your example and logs the output to the console. The sources I tested with are included. Unfortunately, I could not reproduce the issue: about 2k lines were logged without hanging on either platform.

Subprocess_is57_Test.zip