a-b-street / osm2lanes

A common library and set of test cases for transforming OSM tags to lane specifications
https://a-b-street.github.io/osm2lanes/
Apache License 2.0
34 stars, 2 forks

Add Examples To Website #185

Closed droogmic closed 2 years ago

droogmic commented 2 years ago

Add a convenient dropdown to the website for good examples.

droogmic commented 2 years ago

My VS Code did something to the .css file, I don't know why :/

droogmic commented 2 years ago

After: (screenshot)

droogmic commented 2 years ago

Before: (screenshot)

droogmic commented 2 years ago

You can really see how the YAML deserialization consumes about 1.1s of load time...

droogmic commented 2 years ago

https://gtmetrix.com/reports/a-b-street.github.io/gVqO8od3/

dabreegster commented 2 years ago

Grade A, nice! Deferring the YAML load makes perfect sense to me then.

By the way, if you happen to be familiar with how to handle this kind of problem in web browsers, do you know of the "proper" way to load/parse large blobs of data without freezing things up? Downloading a large file and showing a progress bar can be done async by fetching in chunks, but parsing it blocks the CPU. Do any/many web apps have a nicer UX?

The only two ideas I have to work around it are to try web workers (which don't seem to be easy to manage from Rust, though I've spotted some more recent crates that may help), or to use some kind of streaming deserialization that can limit itself to a real deadline, yield, and let the caller update a progress bar or give the user control to interact with the page. For the latter idea, serde doesn't seem to help directly. But if the large blob is a list of stuff, then one approach is to just try and deserialize one item at a time. I haven't seen any real examples doing this, though...

(All of that's an aside; this PR looks fine to me -- great new feature)
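The deadline-bounded idea above can be sketched in plain std Rust. This is a minimal illustration with hypothetical names (`process_in_slices`, `parse_item` standing in for per-item deserialization), not anything from this repo; in a real browser build the outer loop would yield back to the event loop via `setTimeout`/`requestAnimationFrame` between calls instead of spinning in `main`:

```rust
use std::time::{Duration, Instant};

// Process items from an iterator until a time budget is exhausted.
// Returns true when the iterator is fully consumed, false when the
// deadline was hit and the caller should yield, update a progress
// bar, and call again later.
fn process_in_slices<T, U>(
    items: &mut std::vec::IntoIter<T>,
    budget: Duration,
    mut parse_item: impl FnMut(T) -> U,
    out: &mut Vec<U>,
) -> bool {
    let deadline = Instant::now() + budget;
    while Instant::now() < deadline {
        match items.next() {
            Some(item) => out.push(parse_item(item)),
            None => return true, // finished everything
        }
    }
    false // budget exhausted; more items remain
}

fn main() {
    let raw: Vec<i32> = (0..1000).collect();
    let mut iter = raw.into_iter();
    let mut parsed: Vec<i32> = Vec::new();
    // In the browser, each iteration of this loop would be a separate
    // task scheduled on the event loop, keeping the page responsive.
    while !process_in_slices(&mut iter, Duration::from_millis(5), |n| n * 2, &mut parsed) {
        // yield point: update a progress bar here
    }
    assert_eq!(parsed.len(), 1000);
    println!("done: {}", parsed.len());
}
```

The design choice here is cooperative scheduling: instead of a worker thread, the main thread volunteers to stop parsing after a small slice, which is why serde's all-at-once `Deserialize` doesn't fit directly and the "one item at a time" framing matters.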

droogmic commented 2 years ago

> By the way, if you happen to be familiar with how to handle this kind of problem in web browsers, do you know of the "proper" way to load/parse large blobs of data without freezing things up? Downloading a large file and showing a progress bar is possible to do async by fetching in chunks. But parsing it blocks the CPU. Do any/many web apps have a nicer UX? The only two ideas to workaround are to try web workers (which don't seem to be easy to manage from Rust, though I've spotted some more recent crates that maybe help) or to use some kind of streaming serialization that can limit itself to a real deadline, yield, and let the caller update a progress bar or give the user control to interact with the page. For the latter idea, serde doesn't seem to help directly. But if the large blob is a list of stuff, then one approach is to just try and deserialize one item at a time. I haven't seen any real examples doing this though...

I'm not an expert, basically a no-stack developer... I do think web workers are the way; I was just waiting for good integration with trunk: https://github.com/thedodd/trunk/pull/285

Oh, it's done. OK, maybe I will try to make a web worker example then...

droogmic commented 2 years ago

It's basically https://github.com/yewstack/yew/tree/master/examples/web_worker_fib

droogmic commented 2 years ago

Looks simple enough :)

droogmic commented 2 years ago

Still OK after: https://gtmetrix.com/reports/a-b-street.github.io/mX4s9uCK/