mjarosie opened this issue 8 months ago
I am unable to reproduce the problem with the script:
```js
const pino = require("pino");

const transport = pino.transport({
  targets: [
    {
      target: "pino/file",
      options: {
        destination: 1, // stdout
      },
    },
    {
      target: "pino/file",
      options: {
        destination: "./test.log",
        mkdir: true,
        // Neither combination of the two options below makes any difference.
        sync: true,
        fsync: true,
      },
    },
  ],
});

const logger = pino(transport);

let run = true
setTimeout(() => { run = false }, 60_000)

let i = 0
while (run === true) {
  logger.info({ i }, 'test %i', i)
  i += 1
}
```
In one terminal, run `node index.js`. In a second, run `tail -f test.log`. The log file is tailed successfully.
What version of Node.js are you using?
I'm trying to write logs from an application running on a host machine to a Docker container via a mounted volume. Using the `pino/file` transport to target a file in a directory mounted into a Docker container (via `docker-compose`) does not propagate the file changes to the container.

Given the following options that log to both stdout and a file:
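Roughly along these lines (the destination path below is an assumption; the exact config is in the repro repository linked at the end):

```js
const pino = require("pino");

// One pino/file target for stdout, one for a file inside the directory
// that is bind-mounted into the container ("./logs" is an assumed path).
const transport = pino.transport({
  targets: [
    {
      target: "pino/file",
      options: { destination: 1 }, // 1 = stdout
    },
    {
      target: "pino/file",
      options: {
        destination: "./logs/test.log",
        mkdir: true,
      },
    },
  ],
});

const logger = pino(transport);
logger.info("hello from the host application");
```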
And the following compose config:
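Something like this sketch; the service name, image, and mount paths are assumptions rather than the exact config from the repo:

```yaml
# The host directory that pino writes into is bind-mounted into a container
# that tails the log file.
services:
  log-consumer:
    image: alpine:3
    command: ["tail", "-f", "/var/log/app/test.log"]
    volumes:
      - ./logs:/var/log/app
```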
The container does not receive the logs as they're written to the log file; it only sees the log lines once the Node application is closed.
Is there any way of "forcing"
pino/file
transport to somehow "flush" those changes to the file with each new log line?Initially I'd though it's a Docker issue, but then I have also noticed that the same happens with my VSCode editor - if I have the log file opened and the application runs, new logs do not appear, but once I close the application - all new logs suddenly appear in the editor.
Here's the repo that reproduces the issue: https://github.com/mjarosie/pino-file-docker-volume-issue
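For reference, a minimal sketch of one way to get eager writes, assuming a single process and skipping the worker-thread transport entirely: a synchronous `pino.destination` for the file, combined with `pino.multistream` for stdout, writes each line before `logger.info()` returns. The paths and setup here are illustrative, not taken from the repro:

```js
const pino = require("pino");

// Sketch: write the file synchronously (no worker thread, no buffering),
// while still mirroring logs to stdout via multistream.
// "./logs/test.log" is an illustrative path, not the repro's actual one.
const fileDestination = pino.destination({
  dest: "./logs/test.log",
  mkdir: true,
  sync: true, // each log line is written before the logging call returns
});

const logger = pino(
  pino.multistream([
    { stream: process.stdout },
    { stream: fileDestination },
  ])
);

logger.info("this line should be visible in the file immediately");
```

pino also exposes `logger.flush()` for nudging buffered destinations, though with `sync: true` it shouldn't be needed. Whether this changes the bind-mount visibility itself is a separate question, but it does take the transport's buffering out of the picture.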