beenotung / ts-liveview

Build hybrid SSG and SSR realtime SPA/MPA with Typescript
https://liveviews.cc
BSD 2-Clause "Simplified" License

demo: http/2+ may make first page load faster (and more) #4

Closed hpvd closed 2 years ago

hpvd commented 2 years ago

demo: using http/2 for all resources may make the first page load even faster (maybe also the protocol switch to ws). (Yes, of course this is not the focus of the demo, but it helps the overall impression ;-)

current state: [screenshot: 2022-05-11_11h39_38]

[screenshot: 2022-05-11_12h30_48]

beenotung commented 2 years ago

This is a good suggestion. It seems trivial to make express work with http/2, but I'll study how to make websocket (currently using ws on the server side) work with http/2.

Also, the demo server is running behind http-proxy; I will upgrade it to use http2-proxy to make the whole thing work.

beenotung commented 2 years ago

I'm reading about http2. As I understand it, the benefit of http/2 is that it allows the server to actively push resources (js/css) to the browser's cache upon the first GET request, hence allowing the whole download process to finish earlier.

However, if we inline the script and style in the initial http response, the benefit would be minimal?

Also, we need websocket to pass client events to the server and to pass DOM update commands to the client, which does not seem to be supported by http/2. If we need to fire another http request to upgrade to websocket, it may not be beneficial overall?

beenotung commented 2 years ago

I prefer to use websocket to pass client events instead of using ajax, to save the overhead of an http request. I learned that the http headers are encoded in a binary format with better compression in http/2, but that is still more overhead than sending a websocket message?
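To put a number on the websocket side of that comparison: per RFC 6455, a frame carries only a few bytes of framing on top of the payload. A small helper (hypothetical, for illustration) computes the frame-header size:

```typescript
// Per RFC 6455, a WebSocket frame header is 2 bytes, plus 2 or 8 bytes
// of extended payload length for larger messages, plus 4 mask bytes on
// client-to-server frames. So small client messages cost ~6 bytes of
// overhead, versus tens of bytes even for HPACK-compressed http/2 headers.
function wsFrameHeaderSize(payloadLength: number, masked: boolean): number {
  let size = 2;
  if (payloadLength > 65535) size += 8; // 64-bit extended length
  else if (payloadLength > 125) size += 2; // 16-bit extended length
  if (masked) size += 4; // masking key
  return size;
}

console.log(wsFrameHeaderSize(100, true)); // small client frame: 6 bytes
console.log(wsFrameHeaderSize(1000, true)); // 16-bit length: 8 bytes
```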

beenotung commented 2 years ago

For the first request, inlining style and script may be beneficial, but for subsequent visits, having the styles and scripts in separate files may be more cache-friendly (and able to leverage the benefits of http/2?)

hpvd commented 2 years ago

thanks for looking into details!

yes, you are right, one main advantage of http/2 is server push. But as far as I can tell, using websocket should still deliver superior speed.

imho there are 2 other advantages where your ts-liveview would benefit from http/2, but only on first page load:

when looking into the future: http/3 should be the winner...

hpvd commented 2 years ago

For the first request, inlining style and script may be beneficial, but for subsequent visits, having the styles and scripts in separate files may be more cache-friendly (and able to leverage the benefits of http/2?)

I would prefer separate files, since in the real world there is also css and maybe some other js file for animations, or at least for swapping the very light "preview image" for the full image once it has finished loading in the background...

hpvd commented 2 years ago

just stumbled on this detailed answer to the question: does http/2 make websockets obsolete? (the answer was constantly updated and improved from 2015 to 2021) https://stackoverflow.com/questions/28582935/does-http-2-make-websockets-obsolete/42465368#42465368

beenotung commented 2 years ago

The stackoverflow post also talked about server-sent events. SSE looks good, but it has a per-domain limitation (at most 6 connections), so it won't work well if there are multiple tabs open. But it seems this limitation is not imposed on http/2 connections?

beenotung commented 2 years ago

If SSE is usable, it may be preferable, because it is more reliable. With websocket, we need to manually detect and resend dropped messages.
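The reliability point can be made concrete: recovery is built into the SSE wire format. Each event may carry an `id:` field, and on reconnect the browser's EventSource automatically sends it back in the `Last-Event-ID` request header, so the server knows where to resume. A minimal event formatter (a hypothetical helper, not from ts-liveview):

```typescript
// Encode one server-sent event. The `id` field is what makes SSE
// "more reliable": after a dropped connection, EventSource reconnects
// and sends `Last-Event-ID: <id>`, so the server can replay from there.
function formatSseEvent(id: number, data: string): string {
  const dataLines = data
    .split('\n')
    .map(line => `data: ${line}`) // multi-line payloads need one data: line each
    .join('\n');
  return `id: ${id}\n${dataLines}\n\n`; // blank line terminates the event
}

console.log(JSON.stringify(formatSseEvent(1, 'hello')));
// → "id: 1\ndata: hello\n\n"
```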

hpvd commented 2 years ago

hmm...

https://caniuse.com/websockets vs https://caniuse.com/eventsource (are these the right ones?) details: https://html.spec.whatwg.org/multipage/server-sent-events.html#server-sent-events

hpvd commented 2 years ago

if one really thinks about swapping WS for another technology -> maybe one should also look directly into http/3 (for direct use, or to prepare an upgrade path)?

https://www.cloudflare.com/learning/performance/what-is-http3/

hpvd commented 2 years ago

maybe not yet, but soon ;-) https://caniuse.com/?search=http%2F3

hpvd commented 2 years ago

http/2 in node.js: https://www.section.io/engineering-education/http2-in-nodejs/

hpvd commented 2 years ago

in this 3-part series, some good details are described, not only about http/3 as the title suggests, but also (indirectly) about http/2: https://www.smashingmagazine.com/2021/09/http3-practical-deployment-options-part3/

hpvd commented 2 years ago

last one: great technical detail about the benefits of each http version (1/2/3): https://www.toptal.com/web/performance-working-with-http-3

hpvd commented 2 years ago

really the last comment ;-)

DEMO:

one of the fastest sites on first load I could find (the site is not light; using a fast connection, the same as for https://github.com/beenotung/ts-liveview/issues/4#issue-1232343236 )

https://pulsar.apache.org/case-studies/

(of course, measured latency also depends on the distance: browser - server/datacenter)

[screenshot: 2022-05-16_13h49_33]

[screenshot: 2022-05-16_13h08_59]

hpvd commented 2 years ago

a detailed comparison for the first element:

of course, measured latency also depends on the distance: browser - server/datacenter

but the structure is different between http/1.1 with TLSv1.2 and http/2.0 with TLSv1.3

[screenshot: 2022-05-16_13h56_21]

hpvd commented 2 years ago

sorry for spamming (-> maybe move from issues to discussions?), but this may be interesting!

regarding using http/3 and "fallback" to http/2 to support all browsers: https://stackoverflow.com/questions/61172217/does-http3-quic-fall-back-to-tls-1-2-if-the-browser-doesnt-support-quic

and a minimal server config for even further clarification (yes, it's for nginx and old, but the main principle should stay the same..): https://blog.cloudflare.com/experiment-with-http-3-using-nginx-and-quiche/

=> with this in mind, it may not be too early to think about http/3 :-)

beenotung commented 2 years ago

Thanks for the updates. It seems Safari doesn't support http/3 yet.

The example on how to push static content with express is helpful. Your example used spdy instead of the http2 module to create the web server; it seems the difference between spdy and http2 is that they use different algorithms to compress the http headers.

With http/2, the server actively pushes the static resources (css/js) to the browser's cache, which seems to give similar performance to inlining the styles and scripts in the http response. The overall performance for the initial page load should be similar?

For subsequent routing, when the user switches to another view, part of the dom should be updated, and the new view will require new styles. In that case, if the dom update command is sent over websocket while the styles get pushed over the previous http/2 connection, the setup seems trickier than having the style inline with the rest of the dom elements (in json over ws).

The server push behaviour may be beneficial for pushing images referenced by img elements, though.

In this case, maybe it's better to run the server with multiple versions of http at the same time, and upgrade the connection when supported [1]

Thanks again for the links. It seems it would be beneficial in terms of performance to switch to http/3 (quic over UDP) even when we do not leverage the server push feature, because with QUIC we get shorter handshaking overhead and lost packets are resent in a lower layer (hence earlier, when needed) [2]

Also, it seems great that the QUIC server doesn't have to listen on port 443 when the http/1 or http/2 server responds with the header Alt-Svc: h3=":_the_port_".

hpvd commented 2 years ago

good to read that there were some interesting points included :-)

just some advertising for 3 tiny tools: when working in this field and using Firefox, there are three addons that really help to get an overview of the needed info regarding the current state:

[screenshot: 2022-05-21_21h26_57]

https://addons.mozilla.org/de/firefox/addon/indicatetls/ source: https://github.com/jannispinter/indicatetls

https://addons.mozilla.org/de/firefox/addon/http2-indicator/ source: https://github.com/bsiegel/http-version-indicator

https://addons.mozilla.org/de/firefox/addon/websocket-detector/ source: https://github.com/mbnuqw/ws-detector

hpvd commented 2 years ago

examples: using these tools, one easily stumbles across things like this:

[screenshot: 2022-05-21_22h22_34]

beenotung commented 2 years ago

Thanks for sharing the tools. I was not aware that the linked image was using an older TLS version.

I'll update the link to use an https image proxy like https://images.weserv.nl. This proxy server is using TLS 1.3.

hpvd commented 2 years ago

just another thought on relying on 2 different connection types (http and ws): you have to take care of security and abuse for 2 different stacks. I stumbled across this topic when searching for background on how to protect against DDoS and unfriendly bots...

for ws and security this one was interesting: WebSocket Security: Top 8 Vulnerabilities and How to Solve Them https://brightsec.com/blog/websocket-security-top-vulnerabilities/

beenotung commented 2 years ago

I'm considering sse over http2 vs ws.

When using sse (server-sent events) over http/2, it seems there can be at most 100 concurrent streams shared among all the tabs, which seems plenty. The EventSource in the browser will automatically reconnect when the network fails, and automatically ask for the events missed between the two connections.

Even if we need to fall back to http/1 for some browsers, we can work around the maximum-6-connections limit with the storage event [1]

However, it may incur more latency on interaction, as sse doesn't support sending messages from the client side (hence we would need ajax to send events from client to server, with additional http headers in the request and response)

[1] https://fastmail.blog/historical/inter-tab-communication-using-local-storage/

hpvd commented 2 years ago

very interesting details and background.

I'm not sure about the fallback and workaround described in [1] from 2012. Modern browsers (2022) may throttle background tabs/connections aggressively to save battery on mobile devices; this could break the master-tab solution suggested in [1], or at least add complexity, since these things depend heavily on the browser type and could change over time. Just an impression on this topic: https://news.ycombinator.com/item?id=13471543

Regarding latency, one may need a test... of course there was a great reason why ws was chosen by Phoenix LiveView (and you): latency, but also some others.. The question is:

hmm everything is a compromise :-)

hpvd commented 2 years ago

just looked through the linked sources again:

http/2 seems to support full bidi streaming, so there seems to be no latency disadvantage (no need for ajax)

as far as I can tell, one needs to use:

Details:

Articles like this (linked in another answer) are wrong about this aspect of HTTP/2. They say it's not bidi. Look, there is one thing that can't happen with HTTP/2: After the connection is opened, the server can't initiate a regular stream, only a push stream. But once the client opens a stream by sending a request, both sides can send DATA frames across a persistent socket at any time - full bidi.

That's not much different from websockets: the client has to initiate a websocket upgrade request before the server can send data across, too.

...

If you need to build a real-time chat app, let's say, where you need to broadcast new chat messages to all the clients in the chat room that have open connections, you can (and probably should) do this without websockets.

You would use Server-Sent Events to push messages down and the Fetch api to send requests up. Server-Sent Events (SSE) is a little-known but well supported API that exposes a message-oriented server-to-client stream. Although it doesn't look like it to the client JavaScript, under the hood your browser (if it supports HTTP/2) will reuse a single TCP connection to multiplex all of those messages. There is no efficiency loss and in fact it's a gain over websockets because all the other requests on your page are also sharing that same TCP connection. Need multiple streams? Open multiple EventSources! They'll be automatically multiplexed for you.

this and more details in: https://stackoverflow.com/a/42465368

edit: if this can really be confirmed and it works like this, I was exactly right yesterday with my casual comment "could it be matched by other protocols and advanced configurations" :-) -> at first, the client has to open the stream, similar to how the client has to initiate the WS upgrade... (so you could/should even keep the WS indicator badge on your demo; it only has to be renamed to: stream open)

beenotung commented 2 years ago

It would be very interesting if we could do bidirectional streaming with http/2 (fetch push from the client, event source push from the server, or a streaming response to the previous fetch).

I'm following the demo on https://web.dev/fetch-upload-streaming. The demo uses TextDecoderStream and stream.pipeThrough(), which don't seem to be supported by the mainstream distribution of Firefox, but it seems possible with other approaches.

If it really works, the performance will be improved and the security part would be easier to handle!

beenotung commented 2 years ago

Update: I cannot get a client-to-server stream to work with http/1 or http/2 yet; the body sent from Firefox and Chrome appears to be the stringified object [object ReadableStream], not the actual stream content.

Maybe it needs to be multiple ajax requests instead of a single streaming ajax request at the moment?

hpvd commented 2 years ago

hmm does this work?

After the connection is opened, the server can't initiate a regular stream, only a push stream. But once the client opens a stream by sending a request...

could you open a stream, after the connection is established, from the client side to the server?

hpvd commented 2 years ago

some more background on http/2 bidi streaming: https://web.dev/performance-http2/#streams-messages-and-frames

hpvd commented 2 years ago

This looks interesting: A Complete Guide to HTTP/2 in Node.js (With Example Code) https://www.sohamkamani.com/nodejs/http2/

beenotung commented 2 years ago

https://www.sohamkamani.com/nodejs/http2/

This guide shows how to create an http/2 stream from node.js, which is helpful.

I'm still looking for a way to open the stream from the browser. Maybe I misunderstood; it seems the browser doesn't expose the http/2 stream API. Do we open an http/2 stream from the browser with an ordinary ajax request?

hpvd commented 2 years ago

Streaming requests with the fetch API: https://web.dev/fetch-upload-streaming/

beenotung commented 2 years ago

Streaming requests with the fetch API: https://web.dev/fetch-upload-streaming/

From the website:

If the browser doesn't support a particular body type, it calls toString() on the object and uses the result as the body. So, if the browser doesn't support request streams, the request body becomes the string "[object ReadableStream]". When a string is used as a body, it conveniently sets the Content-Type header to text/plain;charset=UTF-8. So, if that header is set, then we know the browser doesn't support streams in request objects, and we can exit early.

It seems the latest available versions of Firefox (v101) and Chrome (v102) on the Arch Linux AUR don't support request streams at the moment.

Will keep exploring alternative approaches :muscle:

hpvd commented 2 years ago

interesting, but there should be a way, since:

the fetch api to send requests up has 93% coverage on caniuse: https://caniuse.com/mdn-api_fetch and there also seems to be a polyfill if one really!? wants/needs more browser support (e.g. IE10): https://github.com/github/fetch

hpvd commented 2 years ago

last link: https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch

hpvd commented 2 years ago

there is also a compatibility table: https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch#browser_compatibility

hpvd commented 2 years ago

and there is also a link to many fetch examples: https://github.com/mdn/fetch-examples/

hpvd commented 2 years ago

another reason for using http/2:

you do not have to rely on possibly unsafe inlining of resources for the speed of the first meaningful paint; see https://github.com/beenotung/ts-liveview/issues/8#issuecomment-1146742353

hpvd commented 2 years ago

is there anything one can do to help on this topic, besides

what was tried/done above?

Would be glad to help :-)

beenotung commented 2 years ago

I generally agree http/2 is better than http/1 in terms of performance.

I just need to update my deployment setup (a local port-forwarding http proxy) to allow a liveview app to serve https directly (instead of relying on the proxy to provide https).

beenotung commented 2 years ago

meanwhile, I'm referring to this article on how to develop with https locally

hpvd commented 2 years ago

This is pretty interesting! Thanks!

beenotung commented 2 years ago

The demo site is now working with http/2 (handled by the proxy in front of it).

And I can get websocket (over http/1.1) working with the http2 server when testing locally.

Will consolidate everything and then include it in the next release under the current major version :muscle:

hpvd commented 2 years ago

sounds great :-)

beenotung commented 2 years ago

now ts-liveview comes with http2 support out-of-the-box in v4.4.0 :tada:

Thanks for your suggestions and encouragements!

hpvd commented 2 years ago

just for comparison, some numbers: looks like it was worth it :-)

(sorry for the different scaling factors)

[screenshot: 2022-08-08_14h40_09]

beenotung commented 2 years ago

With http/2, it seems to have less latency even though the payload increased.

Between the two versions, I also disabled compression on the deployed version. In the original deployment, compression was performed twice: once between the public network and the proxy, and once between the proxy and the web server. (Now the connection between the web server and the proxy is not compressed.)

beenotung commented 2 years ago

Now with http2 enabled, we can investigate the potential benefit of using ajax+sse vs websocket with lost message handling.

hpvd commented 2 years ago

Now with http2 enabled, we can investigate the potential benefit of using ajax+sse vs websocket with lost message handling.

yeah!

Following the latest sources mentioned from June 2+, I still believe there is a way to use

:-)