ember-fastboot / fastboot-app-server

A production-ready app server for running Ember FastBoot apps

Slow response times on Heroku #110

Open allthesignals opened 4 years ago

allthesignals commented 4 years ago


Hi there – I'm looking for some guidance on improving this.

One of the benefits of fastboot is faster response times, but I've seen some routes take almost 7 seconds to load from the server. Is this not a fastboot issue? I am using this app server as such:

```javascript
// frontend-server.js

const FastBootAppServer = require('fastboot-app-server');

const server = new FastBootAppServer({
  distPath: 'dist',
  gzip: true,      // Optional - Enables gzip compression.
  host: '0.0.0.0', // Optional - Sets the host the server listens on.
  chunkedResponse: true, // Optional - Opt in to chunked transfer encoding, sending the
                         // head, body, and any shoeboxes in separate chunks. This helps
                         // most when the app transfers a lot of data in the shoebox.
});

server.start();
```

My Procfile:

```
web: node --optimize_for_size --max_old_space_size=460 --gc_interval=100 frontend-server.js
```
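To see whether the dyno is actually approaching that `--max_old_space_size` ceiling, one option is to log memory usage from inside the server process. A minimal sketch (the 30-second interval and the choice of fields are assumptions, not anything this repo prescribes):

```javascript
// Periodically log Node's memory usage so Heroku logs show growth over time.
// A heapUsed value that climbs steadily across requests and never settles
// after GC is a hint of a leak.
function logMemoryUsage() {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  const toMB = (bytes) => (bytes / 1024 / 1024).toFixed(1);
  console.log(`rss=${toMB(rss)}MB heapUsed=${toMB(heapUsed)}MB heapTotal=${toMB(heapTotal)}MB`);
}

// unref() so the timer doesn't keep the process alive on its own.
setInterval(logMemoryUsage, 30 * 1000).unref();
```

Dropping something like this into `frontend-server.js` before `server.start()` makes it easier to correlate slow responses with memory pressure in `heroku logs`.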

There are a lot of variables I'm struggling to isolate: my application code leaking memory, an addon leaking memory, the Heroku dyno lacking the right amount of memory, the particular buildpack I'm using, the way I've set up my FastBootAppServer initialization, or some other tooling I'm missing.

I'm able to get this to work reasonably well locally, but I am spinning my wheels on the deployment step.

CvX commented 4 years ago

Yeah, with such high memory usage (and swap usage in particular) response time will definitely skyrocket. It could be a leak that manifests itself only in FastBoot, but I'd check how the usage looks in the browser first.

Here's the memory usage of my production FastBoot app (with similar FastBootAppServer and node options) for comparison:

> One of the benefits of fastboot is faster response times

Unless you do any kind of caching, FastBoot doesn't make responses any faster. Slower, if anything, because it puts one more stop between the browser and your backend server.
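To make the caching point concrete, here's a sketch of the kind of TTL-based in-memory HTML cache that could sit in front of the render step. All names and the 60-second TTL are hypothetical, and this is not the fastboot-app-server API; in production a CDN or Varnish in front of the dyno would be more typical:

```javascript
// Tiny TTL cache keyed by URL path (illustrative only).
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.entries = new Map(); // path -> { html, expiresAt }
  }

  get(path) {
    const entry = this.entries.get(path);
    if (!entry || Date.now() > entry.expiresAt) {
      this.entries.delete(path); // miss or stale
      return null;
    }
    return entry.html;
  }

  set(path, html) {
    this.entries.set(path, { html, expiresAt: Date.now() + this.ttlMs });
  }
}

// Usage: check the cache before rendering with FastBoot, store the result after.
const cache = new TtlCache(60 * 1000); // hypothetical 60s TTL
cache.set('/about', '<html>…</html>');
console.log(cache.get('/about') !== null); // fresh entry: true
```

The idea is that only the first request per path within the TTL pays the FastBoot render cost; everything else is a map lookup.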

allthesignals commented 4 years ago

@CvX thank you for that context! It's good to see an example of typical memory usage with this particular brew of technologies.

One follow-up question I have is: what are the pros and cons of setting up a CDN for the build output, using the S3 notifier and downloader with fastboot-app-server? Faster deployment turnaround? If an app serves many different parts of the world (mine serves only the NYC area), a CDN makes sense because it serves assets closer to the requesting client.
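For reference, the S3 wiring being asked about would look roughly like this sketch, assuming the `fastboot-s3-downloader` and `fastboot-s3-notifier` packages from the ember-fastboot org; the bucket and key values are placeholders, not real config:

```javascript
// Sketch: fetch the built app from S3 and redeploy when the build artifact changes.
const FastBootAppServer = require('fastboot-app-server');
const S3Downloader = require('fastboot-s3-downloader');
const S3Notifier = require('fastboot-s3-notifier');

// Placeholder bucket/key - substitute your own.
const downloader = new S3Downloader({
  bucket: 'my-app-builds',
  key: 'fastboot-deploy-info.json',
});

const notifier = new S3Notifier({
  bucket: 'my-app-builds',
  key: 'fastboot-deploy-info.json',
});

const server = new FastBootAppServer({
  downloader, // replaces the static distPath
  notifier,   // polls S3 and reloads the app on a new build
});

server.start();
```

This decouples deploys from dyno restarts: pushing a new build to S3 rolls it out without redeploying the server itself.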

And yes, my assumption about what fastboot gets me w/r/t response time was wrong – it's good to know, though, what fastboot does and doesn't cover. I've heard people use Varnish for caching? I think what needs to happen is that I isolate the memory-leak and performance issues I'm seeing in my app on the frontend. That has been a labyrinthine experience, but maybe with more time on it I'll narrow it down.

> I'd check how the usage looks in the browser first

What kind of approach would you take? Heap snapshots over tests? Running a performance profiler, improving component load, then benchmarking against more heap snapshots? I've been working through this, but the process feels a bit like reading tea leaves (because I'm not good at it yet).

Random thoughts: