Closed budarin closed 2 years ago
Typically it is not recommended to test the logs unless the log is a feature, thus logLevel: 'silent'
is common for testing. However, you can provide a destination stream and inspect logs through that https://github.com/pinojs/pino/issues/1274#issuecomment-997971329
And what other testing options are there in this case, if using the logs directly is not desirable? It's a microservice, and its logs are a mandatory part of the service.
https://github.com/pinojs/pino/issues/1274#issuecomment-997971329 is tap-specific. How to intercept pino output with jest?
It is not tap specific, it just uses tap. I assume it is easily adaptable to any useful Node.js testing framework.
Unfortunately I'm not strong enough in either jest or pino to do this :(
What testing looks like now
As far as I understand the example, tap provides a `dest` output stream in its test context, but jest has no such thing. It is not clear where to connect `split` in jest.
No. A stream is created for each subtest and then that stream is passed to the logger in each test as the destination stream.
It is unclear how to reach the logger that is initialized inside the service and pass this stream to it. In tap, as you show in the code, it is solved elegantly, but in other test runners it is unclear how to implement this.
The example shows a logger being tested. You must adapt the concept to apply to your application.
To be honest, I can't even imagine how, once you create a pino logger, you can swap its destination on the fly for each test :(
You cannot change the destination stream after the logger is created. The same applies to tap and to every other test framework.
Your tests should be self-contained. That means each test should provide its own copy of fastify. This way, you can have a separate logger for each test, start each test as you need, and run the tests in parallel.
Trying to use Writable streams to implement the idea, but it fails :(
```js
import pino from 'pino';
import { Writable } from 'node:stream';

const logLines = [];

class LogToArrayStream extends Writable {
  _write(chunk, encoding, done) {
    logLines.push(JSON.parse(chunk.toString()));
    done();
  }
}

const testLogger = pino(
  { level: 'trace' },
  pino.destination(new LogToArrayStream()),
);

testLogger.info('Hello');
console.log('logLines', logLines);
```
PS: using `write` instead of `_write` does not change the result.
In the console:

```
{"level":30,"time":1653043367560,"pid":20832,"hostname":"WIN-40K96OG4L3A","msg":"Hello"}
logLines []
```
What is wrong?
Because you are not following the documentation: https://github.com/pinojs/pino/blob/master/docs/api.md#pinodestinationopts--sonicboom
`pino.destination` does not accept a stream as a parameter. The IDE flags it too.
Finally solved the problem! :)
```ts
import pino from 'pino';
import { Writable } from 'node:stream';

const logLines = [] as Record<string, unknown>[];

class LogToArrayStream extends Writable {
  _write(chunk, encoding, done) {
    logLines.push(JSON.parse(chunk.toString()));
    done();
  }
}

const runtimeLogger = pino({ level: 'trace' }, pino.multistream([{ stream: new LogToArrayStream() }]));

beforeAll(async () => {
  // pass runtimeLogger to the Fastify factory
  app = await createApp(runtimeLogger);
  await app.ready();
});

afterAll(async () => {
  await app.close();
});

beforeEach(() => {
  logLines.length = 0;
});

describe('Some tests', () => {
  test('Test1', async () => {
    // check logLines is empty
    expect(logLines).toHaveLength(0);

    const response = await app.inject({
      ....
    });
    expect(response.statusCode).toBe(200);

    // use the captured log lines
    expect(logLines[0].msg).toBe('Some message');
  });
});
```
@mcollina There is a need to re-export pino from fastify, so as not to load a duplicate dependency just to create a custom logger for tests and for the service.
Why don't you pass the writable through the `logger` option? The task is as simple as:

```js
import Fastify from "fastify";
import { Writable } from "stream";

const stream = new Writable({
  write(chunk) {
    console.log(chunk.toString());
  },
});

const fastify = Fastify({ logger: { stream: stream } });
```
This is the solution stated by @jsumners, and you are overcomplicating things.
I want to configure other parameters as well
It does not block you from configuring other stuff.
Most of the time I would agree to re-export stuff, but not for debug / test purposes only.
To create a custom pino Logger instance?
Then you should really install it as a dependency, since it is part of your application logic.
But this dependency is already in fastify! Why increase install and test time, track version compatibility, and grow node_modules, if you could just take the re-exported dependency directly from fastify?
By the way, this code does not work :(
You have stated that the data in the logs are a feature of your application that you want to guarantee through testing. Therefore, you should treat it as such and supply your own logger instance to Fastify.
This code does work, but it may not fulfill what you need, or you are doing something wrong.
Not re-exporting keeps things simple.
Have you benchmarked how much importing the dependency yourself really impacts test load time? AFAIK, once you import it, subsequent imports come from the module cache.
Is there an option in fastify to configure the destination, to set the `minLength` and `sync` options?
This would be the best option: if the pino logger could be configured for every taste using only the fastify options, there would be no need to install an additional copy of pino!
Yes, you can do it already.
`logger.stream` is passed as the second parameter when pino is created.
`logger.file` creates a `pino.destination` and passes it to pino.
The other options remain the same and are passed to pino directly.
Again, it requires pino as an explicit dependency :(
I think there is nothing more I can help with here.
Just to repeat once again: whatever you pass to `pino(<options>, <stream>)`, you can do the same with fastify options only.
The examples below do the same job.
```js
import Fastify from "fastify";
import pino from "pino";
import { Writable } from "stream";

class CustomStream extends Writable {
  lines = [];
  _write(chunk, _encoding, callback) {
    this.lines.push(chunk.toString());
    callback(); // signal completion, or only the first write goes through
  }
}

{
  // with import pino
  const stream = new CustomStream();
  const fastify = Fastify({
    logger: pino({ level: "trace" }, pino.multistream([{ stream }])),
  });
  await fastify.inject("/");
  console.log(stream.lines);
}

{
  // with import pino, without pino.multistream
  const stream = new CustomStream();
  const fastify = Fastify({
    logger: pino({ level: "trace" }, stream),
  });
  await fastify.inject("/");
  console.log(stream.lines);
}

{
  // without import pino
  const stream = new CustomStream();
  const fastify = Fastify({
    logger: { level: "trace", stream },
  });
  await fastify.inject("/");
  console.log(stream.lines);
}
```
@climba03003 In general, the problem in this issue has been solved.
It would be great if this were documented somewhere or posted in the examples.
Thank you for your active participation and help!
Unfortunately, none of the options where a Writable stream is passed directly as the stream works :(
Is it related to the pino version, 6 or 7? With pino 6 it works.
It should work in both pino@6 (fastify@3) and pino@7 (fastify@next).
I would guess that (maybe) his application logic makes it impossible. No related code has been provided in the conversation above; I can only give advice on what can be done.
Our goal was to see pino-pretty output in tests. Here is how we achieved it.
First we added a custom file to `setupFilesAfterEnv` in `jest.config.mjs`:
```js
// File: jest.config.mjs
const config = {
  // ...
  setupFilesAfterEnv: ['./jest-fastify-env.mjs'],
  // ...
};

export default config;
```
Then we configured a global instance of the Fastify app and the logger:
```js
// File: jest-fastify-env.mjs
import builder from './app.mjs'; // app builder which builds the Fastify instance
import { Writable } from 'stream';

const stream = new Writable({
  write(chunk) {
    console.log(chunk.toString());
  },
});

beforeAll(async () => {
  global.app = builder({
    logger: {
      stream: stream,
      level: 'trace',
      transport: {
        target: 'pino-pretty',
        options: {
          translateTime: 'HH:MM:ss Z',
          ignore: 'pid,hostname,reqId',
        },
      },
    },
  });
  await global.app.ready();
});

afterAll(async () => {
  await global.app.close();
});
```
Example output:

```
$ node 'node_modules/.bin/jest' 'app.test.mjs'
 PASS  ./app.test.mjs
  ✓ requests the "/" route (77 ms)

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        1.573 s, estimated 2 s
Ran all test suites matching /app.test.mjs/i.
[20:20:33 UTC] INFO: incoming request
    req: {
      "method": "GET",
      "url": "/",
      "hostname": "localhost:80",
      "remoteAddress": "127.0.0.1"
    }
[20:20:33 UTC] INFO: Here is our custom log message.
[20:20:33 UTC] INFO: request completed
    res: {
      "statusCode": 200
    }
    responseTime: 59.705299999564886
```
Very verbose output ;) But this was the desired behaviour.
A simpler solution:

```ts
import pino from 'pino'

describe('Logger', () => {
  it('should print log lines', () => {
    const logs = new Array<string>()
    const dest = {
      write: (data: string) => logs.push(data),
    }
    const logger = pino({}, dest)
    logger.info('test')
    expect(logs).toHaveLength(1)
  })
})
```
Wanted to note that @maucrvlh's answer worked out of the box for multiple messages, but if you're going to use a `stream.Writable`, I had to call the callback (3rd argument to `_write`), otherwise pino would only send the 1st message through.
Something like this worked when using a `Writable`:
```ts
export class TestStream extends stream.Writable {
  messages: Record<string, unknown>[] = [];

  _write(chunk: any, _encoding: BufferEncoding, cb: (error?: Error | null) => void) {
    this.messages.push(JSON.parse(chunk));
    cb(null); // indicate we're done and no errors, otherwise we'll only see the first msg
  }
}
```
But I went with something based on @maucrvlh's approach, because it was simpler:
```ts
export class TestStream implements pino.DestinationStream {
  messages: Record<string, unknown>[] = [];

  write(msg: string) {
    this.messages.push(JSON.parse(msg));
  }
}
```
💬 Question here
In many cases, my service responds with the standard text Server Error. When testing, I would like to know the reason for this error, and it is usually only visible in the service logs. How do I get access to the service logs during testing?
And a related question: during testing, the entire log output ends up on stdout together with the jest results. How can I keep logs out of stdout while testing?
Fastify: next