BetterStackHQ / logtail-vercel-nextjs-example


Is it possible to avoid await while logging, to improve API method performance? #1

Open GorvGoyl opened 2 years ago

GorvGoyl commented 2 years ago

In the provided example, await is used to write logs in Next.js API routes, and it blocks the execution of the rest of the code. The situation gets worse if there are multiple logs inside a function, reducing the overall performance of the API. Is there a way to write logs asynchronously so that the method stays performant?

await logger.info(
    'Handling a request to /api/hello',
    { user } // you can log structured data
  );

gyfis commented 2 years ago

@GorvGoyl Thanks for raising the issue!

You only have to use await if you want to wait until the logs are sent. We batch the logs in the background, so usually you don't have to worry about that; you'd typically only want to await the log send at the end of your function (on Lambda, for example). We include the await in this example to make sure the logs are sent.

GorvGoyl commented 2 years ago

If I don't use await then the logs are missing from logtail dashboard.

you'd usually only want to await the log send at the end of your function

I didn't get this. Do you mean I can write logs across the function but they would get sent only at the end of the function call?

gyfis commented 2 years ago

If I don't use await then the logs are missing from logtail dashboard.

you'd usually only want to await the log send at the end of your function

I didn't get this. Do you mean I can write logs across the function but they would get sent only at the end of the function call?

We send logs in the background, so you only need to await for the logs to be sent at the end of your program (if it’s a short-lived one, like Lambda). If you have a node server, you shouldn’t need to await, since the logger isn’t destroyed. Does this help?

GorvGoyl commented 2 years ago

My question is specifically about Vercel serverless functions. I'm still not sure how to send logs "at the end of the function call". The example given in the repo sends logs during the function call, which contributes to the total execution time:

export default async function handler(req, res) {
  const user = {
    id: 123,
    name: 'John Doe',
  };

  await logger.info(
    'Handling a request to /api/hello',
    { user } // you can log structured data
  );

  res.status(200).json({ name: user.name })
}

and if I remove the await then the logs are missing from dashboard.

gyfis commented 2 years ago

@GorvGoyl I believe this is the expected behaviour. When serverless functions exit/close, the runtime quits and no logs can be sent, so you need to wait for the client to send the logs, and that contributes to the function duration.

We’re open to advice on how to rewrite the js client so it doesn’t increase the duration of the function while ensuring logs are sent 🙌

gyfis commented 2 years ago

@GorvGoyl We also have an integration specifically for Vercel, which may be able to overcome this behaviour: https://vercel.com/integrations/logtail I believe you can use this instead of the node client, for better results.

GorvGoyl commented 2 years ago

We're using the Vercel integration, and logs generated from console.log() are shown in the dashboard. It's just that JSON objects inside console.log come through stringified instead of as JSON structures, but I think that's an inherent limitation of console.log.

nadilas commented 2 years ago

Do you mean I can write logs across the function but they would get sent only at the end of the function call?

@GorvGoyl I think @gyfis was referring to a sort of flush implementation. Here's an untested idea for that:

import { Node as Logtail } from '@logtail/js';

const logger = new Logtail(process.env.LOGTAIL_SOURCE_TOKEN);

export default async function handler(req, res) {
  const user = {
    id: 123,
    name: 'John Doe',
  };

  logger.info(
    'Handling a request to /api/hello',
    { user } // you can log structured data
  );

  res.status(200).json({ name: user.name })

  await flush(1000)
}

// Wait until every accepted log has been synced, or give up after `timeout` ms.
const flush = async (timeout) => {
  await condition(() => logger.logged() === logger.synced(), timeout)
}

const sleep = (timeout) => {
  return new Promise((resolve) => {
    setTimeout(resolve, timeout)
  })
}

const condition = async (fn, timeout) => {
  if (timeout) {
    // Race the condition against a timeout: resolves true if the
    // condition was met first, false if the timeout fired first.
    return Promise.race([sleep(timeout).then(() => false), conditionInner(fn).then(() => true)]);
  }
  return conditionInner(fn);
}

const tryUnblockCondition = (fn, resolve) => {
  // Eager evaluation
  if (fn()) {
    resolve();
    return;
  }

  setTimeout(() => {
    tryUnblockCondition(fn, resolve)
  }, 5)
}

const conditionInner = async (fn) => {
  return new Promise((resolve) => {
    tryUnblockCondition(fn, resolve)
  });
}

lumenwrites commented 2 years ago

Hey guys, is there any solution to this?

I'd really love to use logtail, but this issue is preventing me from that.

What I need is to be able to send structured logs to logtail.

Vercel integration doesn't work because all logs arrive as strings, not JSON objects.

And @logtail/js doesn't work because it dramatically slows down the app performance. If I have 10 log statements in my serverless function (API handler or getServerSideProps), and awaiting each of them takes like half a second, it'll slow down the response by like 5 seconds, which makes it unusable.

I really love Logtail's UI, and I'd love to use it, but I'd need some sort of integration that works.

gyfis commented 2 years ago

@lumenwrites Hi! What you could do with @logtail/js, instead of awaiting every log send, is await only the last one. You need to await at least one to ensure they get sent (or none, if you're okay with losing the logs), but adding a single await shouldn't slow the function down as much.
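For illustration, here is a self-contained sketch of that pattern. The batching logger below is a stand-in, not the real @logtail/js client; it only models the behaviour described above, where all logs in the current batch share one send:

```javascript
// Minimal stand-in for a batching logger (NOT the real @logtail/js API;
// just enough to show why awaiting a single send can flush a whole batch).
function makeBatchingLogger() {
  let batch = [];
  let pending = null;
  const sent = [];
  return {
    info(message) {
      batch.push(message);
      // Every log in the current batch shares one "send" promise.
      if (!pending) {
        pending = new Promise((resolve) => {
          setTimeout(() => {
            sent.push(...batch); // the whole batch goes out together
            batch = [];
            pending = null;
            resolve();
          }, 10);
        });
      }
      return pending;
    },
    sent,
  };
}

async function handler(logger) {
  logger.info('step 1: request received'); // not awaited
  logger.info('step 2: user loaded');      // not awaited
  // Awaiting only the last call resolves when the shared batch is sent,
  // so the earlier logs ride along before the runtime freezes.
  await logger.info('step 3: response sent');
}
```

With this model, the function pays for a single send round-trip instead of one per log statement.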

lumenwrites commented 2 years ago

@gyfis Thank you, this is helpful!

Do you know if it's possible to automate that or abstract that away somehow?

Seems like it would be much too easy to forget to add await to the last log statement, and too difficult to explain to coworkers why it has to be this way. It doesn't seem like the best coding practice.

gyfis commented 2 years ago

@lumenwrites Great points, thanks for raising them. I suppose wrapping a function (the main one, for instance) in another function that performs the await at the end would work well enough.
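A minimal sketch of such a wrapper (hypothetical, not a library API; `flush` stands for any function returning a promise that resolves once pending logs are sent):

```javascript
// Hypothetical wrapper: run the handler, then await a single flush step,
// so individual log calls inside the handler never need an await.
function withLogFlush(handler, flush) {
  return async (req, res) => {
    try {
      return await handler(req, res);
    } finally {
      // Wait for pending logs even if the handler threw.
      await flush();
    }
  };
}
```

Usage would then be something like `export default withLogFlush(handler, flushLogs)`, where `flushLogs` awaits the final log send, so coworkers never have to remember the await themselves.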

I’m forwarding this to the team to look into abstracting this away at the library layer. The issue is that some executors (notably AWS Lambda) make it challenging to figure out that the end of the runtime is near.

lumenwrites commented 2 years ago

@gyfis Thanks, that would be amazing! I don't quite see how that would work with Vercel serverless functions, but if you guys could make a solution that "just works", it would be fantastic!

An alternative solution would be to update your Vercel integration to be able to accept messages from vercel as JSON, not as strings. Maybe parse them automatically when you see that it's JSON?

That way we could just write json into stdout (with something like Pino, or even just console.log maybe), and then your integration would send it to logtail as JSON.

In that case we wouldn't use this library, we'd just print json formatted logs in any way.

gyfis commented 2 years ago

@lumenwrites I believe the JSON parsing should work 🤔 do you mind sending me a sample setup/logs to tomas@betterstack.com of where this Vercel <> JSON doesn’t work as expected? If you don’t have one, I’ll try to reproduce myself. Thanks a lot!

lumenwrites commented 2 years ago

@gyfis

Also, you guys could use next-axiom as an example: https://github.com/axiomhq/next-axiom/blob/main/src/withAxiom.ts

It's a platform similar to Logtail, and they have an integration that is very easy to plug in and "just works". I'd much prefer to use Logtail, because I absolutely love your UI/UX. But writing and maintaining an integration like next-axiom myself would be much too time consuming and complicated. I'd love to use yours though, if you had one.

do you mind sending me a sample setup/logs to tomas@betterstack.com of where this Vercel <> JSON doesn’t work as expected?

I think you'd just need to deploy any simple next.js project, install a Logtail integration, and then try logging out json like this:

const error = {
  message: 'Error message',
  data: {
    name: 'John',
    price: 123,
  }
}
console.log(error)

Or try it with Pino (a popular structured logging solution):

const logger = require('pino')()
logger.info(error)

In both cases you will see that json arrives into Logtail as a string. If you could parse it and turn it into JSON, that'd be amazing.

gyfis commented 2 years ago

Thanks a lot for the detailed info!

WesleyYue commented 1 year ago

It's a platform similar to Logtail, and they have an integration that is very easy to plug in and "just works". I'd much prefer to use Logtail, because I absolutely love your UI/UX. But writing and maintaining an integration like next-axiom myself would be much too time consuming and complicated. I'd love to use yours though, if you had one.

FWIW, I also tried out Logtail first and preferred the UI, but am using Axiom now for this reason. The Vercel integration here doesn't really work because it sends all the console output as one log at the end of the execution, which not only makes it impossible to query but also gets truncated if there is a lot of output in one function run, due to the 4 kB limit. There was also no Pino integration, which I found surprising because betterstack.com has an extremely detailed Pino guide.

curusarn commented 1 year ago

Hi everyone, we have added logtail.flush()! 🎉

Call logtail.flush() at the end of your serverless function to send any currently batched logs to Logtail. Flush will send the logs without waiting for the batching interval.
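Applied to the handler from the repo example, the pattern could look like the sketch below. The logger here is a self-contained stand-in so the snippet runs anywhere; in real code it would be the @logtail/js Node client, and the key point is the single `await logger.flush()` at the end:

```javascript
// Stand-in logger so this sketch is self-contained; in real code:
//   import { Node as Logtail } from '@logtail/js';
//   const logger = new Logtail(process.env.LOGTAIL_SOURCE_TOKEN);
function makeLogger() {
  const batched = [];
  const delivered = [];
  return {
    info(message, context) {
      batched.push({ message, context }); // queued, not sent yet
    },
    async flush() {
      // Send everything currently batched, without waiting for the
      // batching interval.
      delivered.push(...batched.splice(0));
    },
    delivered,
  };
}

const logger = makeLogger();

async function handler(req, res) {
  const user = { id: 123, name: 'John Doe' };

  logger.info('Handling a request to /api/hello', { user }); // no await here

  res.status(200).json({ name: user.name });

  // One await at the very end pushes all batched logs out before the
  // serverless runtime freezes.
  await logger.flush();
}
```

This way each individual log call stays fire-and-forget, and the function pays the send cost only once.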

Thank you for your feedback. We really appreciate it. Let me know if you have any issues. 🙏

We are also actively looking into adding Logtail Next.js integration, where the flushing would be done automatically.

curusarn commented 1 year ago

@WesleyYue

There was also no Pino integration which I found surprising because betterstack.com has an extremely detailed Pino guide.

We have recently added Logtail Pino transport. Check out our Quick start guide.
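For reference, wiring Pino to a transport generally looks like the config sketch below. The target name and option keys here are assumptions to verify against the Quick start guide, not a confirmed configuration:

```javascript
const pino = require('pino');

// Config sketch: the '@logtail/pino' target and the sourceToken option
// name are assumptions; check the Quick start guide for exact values.
const transport = pino.transport({
  target: '@logtail/pino',
  options: { sourceToken: process.env.LOGTAIL_SOURCE_TOKEN },
});

const logger = pino(transport);
logger.info({ user: 'John' }, 'structured log sent via the transport');
```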