open-sauced / app

🍕 Insights into your entire open source ecosystem.
https://pizza.new
Apache License 2.0

Bug: SSR Lambdas on Netlify frequently timeout #3066

Closed · jpmcb closed this issue 6 months ago

jpmcb commented 7 months ago

Describe the bug

The SSR lambdas on Netlify frequently time out:

[Screenshot 2024-03-29 at 11:54:16 AM]

This was happening consistently when I was trying to view any user profile page https://app.opensauced.pizza/user/jpmcb:

[Screenshot 2024-03-29 at 11:50:22 AM]


I'm not sure why this is happening, and the logs don't seem very useful. I was also surprised to see this happen on the user profile page, since that page has always loaded quickly for me.

I worked around it by switching onto my VPN, after which everything loaded fine. That makes me think a lambda in my region (routed based on my internet address) was in a bad state. It may also be worth reaching out to Netlify support.

Steps to reproduce

N/A. Happens intermittently; difficult to know why.

Do you have any images or screen captures?

No response

Browsers

Firefox

Additional context (Is this in dev or production?)

No response


github-actions[bot] commented 7 months ago

Thanks for the issue, our team will look into it as soon as possible! If you would like to work on this issue, please wait for us to decide if it's ready. The issue will be ready to work on once we remove the "needs triage" label.

To claim an issue that does not have the "needs triage" label, please leave a comment that says ".take". If you have any questions, please reach out to us on Discord or follow up on the issue itself.

For full info on how to contribute, please check out our contributors guide.

jpmcb commented 6 months ago

Netlify support provided the following guide:

https://answers.netlify.com/t/support-guide-why-is-my-function-taking-long-or-timing-out/71689

which may be of interest; I'm not sure if we've looked at this before. It's still very strange to me that these seem to work OK and then crash from time to time. The guide lists the following possible causes:

Everything was running smooth… What happened?

- Your code is taking longer to execute due to some condition we (Netlify) cannot predict or control (e.g. a slow third-party API response).
- Your code may have been running at its limit before, and all it took was a small delay to cause the failure.
- Your code does not reliably exit.

Looks like there are a few things from that guide we can look at.
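On the "does not reliably exit" point, here's a rough sketch (not code from this repo) of how a Netlify function can put an explicit timeout around a slow upstream call so the lambda always returns before the platform kills it. The endpoint URL and the 8-second budget are assumptions for illustration:

```ts
// Sketch only: guard a slow upstream call with an explicit timeout so the
// function always exits. The URL and the 8s budget are illustrative.
import type { Handler } from "@netlify/functions";

const handler: Handler = async () => {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 8_000); // stay under the default 10s limit

  try {
    const res = await fetch("https://api.example.com/slow-endpoint", {
      signal: controller.signal,
    });
    return { statusCode: res.status, body: await res.text() };
  } catch {
    // An aborted or failed upstream call still produces a response instead of
    // leaving the lambda hanging until the platform timeout.
    return { statusCode: 504, body: "Upstream call timed out" };
  } finally {
    clearTimeout(timer);
  }
};

export { handler };
```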

nickytonline commented 6 months ago

When using the Essential Next.js Build Plugin, it creates three Netlify functions that handle requests that haven't been pre-rendered. These are netlify-handler (for SSR and API routes), netlify-odb-handler (for ISR and fallback routes), and _ipx (for images).

These get generated for our project during a deploy since we're using Next.js

[Screenshot: CleanShot 2024-04-01 at 12:58:40]

If they're running slow, it's either slow calls to our API (which I doubt after a couple of conversations with @jpmcb) or something in the app code. The odd thing is that this is happening more frequently now.

It might correlate with the uptick in users from GitHub Education, but I'm still not sure. Regardless, adding some HTTP caching should improve these load times; see the sketch below.
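As a rough example of the HTTP caching idea (a sketch assuming the page uses getServerSideProps in the Next.js pages router, not the app's actual code), a Cache-Control header lets Netlify's CDN serve repeat requests without invoking the SSR lambda again:

```ts
// Sketch: set a CDN cache header from getServerSideProps so repeat requests
// can be served from cache instead of re-running the SSR lambda.
// The max-age values are illustrative, not tuned for this app.
import type { GetServerSideProps } from "next";

export const getServerSideProps: GetServerSideProps = async ({ res }) => {
  // Cache at the edge for 60s, then serve stale content for up to 5 minutes
  // while revalidation happens in the background.
  res.setHeader(
    "Cache-Control",
    "public, s-maxage=60, stale-while-revalidate=300"
  );

  return { props: {} };
};
```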

brandonroberts commented 6 months ago

We discussed some caching opportunities for user profiles here as well. Currently we fetch two things during the server render of a user profile: the social card image and the user information.

The social card image could be removed from the server-side fetching, and the user information fetch is already pretty fast. Adding some caching with SWR on top of that could improve the reliability of the page (see the sketch below).
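A minimal sketch of the SWR idea; the endpoint path, hook name, and options here are placeholders rather than the app's real API, and isLoading assumes SWR 2.x:

```ts
// Sketch: client-side caching of user profile data with SWR so repeat visits
// and re-renders reuse cached data instead of hitting the server again.
import useSWR from "swr";

const fetcher = (url: string) => fetch(url).then((res) => res.json());

// Hypothetical hook; the /api/users/:username endpoint is a placeholder.
export function useUserProfile(username: string) {
  const { data, error, isLoading } = useSWR(
    `/api/users/${username}`,
    fetcher,
    { revalidateOnFocus: false, dedupingInterval: 60_000 }
  );

  return { user: data, isLoading, isError: Boolean(error) };
}
```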

open-sauced[bot] commented 6 months ago

:tada: This issue has been resolved in version 2.17.0-beta.1 :tada:

The release is available on GitHub release

Your semantic-release bot :package::rocket:

open-sauced[bot] commented 6 months ago

:tada: This issue has been resolved in version 2.17.0 :tada:

The release is available on GitHub release

Your semantic-release bot :package::rocket: