Closed i336 closed 7 years ago
Just looked into this. I can reproduce it here; it seems to be a web server issue.
The web server was set up with a very simple Dao script (~40 LOC) using an http module that might have performance issues, and it is hosted on a single-core virtual server instance on Aliyun. I tested the http module before on physical servers with multiple cores, and the performance was acceptable if I remember correctly, but on a single-core virtual server it could be a problem. The content is also always served from disk, which could be problematic as well. I will see if something can be improved.
It turns out the web server crashes when serving big files such as dao.js. I have now updated Dao and the http module with some improvements, which should fix the issue.
:D it works now!
There's just one other thing I think I should mention.
(Oh, and with caching actually turned on (whoops!) it drops to 33 seconds.)
This is due to the slow-ish download speed:
jquery-1.9.1.min.js 100%[=====================>] 90.46K 141KB/s in 0.6s
ace.js 100%[=====================>] 288.59K 211KB/s in 1.4s
dao.js 100%[=====================>] 2.89M 139KB/s in 23s
dao.js.mem 100%[=====================>] 321.95K 146KB/s in 2.2s
FWIW, (141+211+139+146)/4 = 159KB/s overall average, but in flight the transfers tried to stay near 160-180KB/s.
I don't know whether this is due to:
- network latency/distance issues (I don't know what region your VPS is in, so I can't measure the distance from Sydney),
- Aliyun-plan-specific traffic shaping, or
- performance/throughput caps in the Dao-based HTTP server you're using.
My ADSL2+ connection gets around 11Mbps. Under ideal circumstances I know I can download at around 1.15MB/s.
Regardless of the cause, one solution could be to move dao.js, dao.js.mem, ace.js, and jquery-1.9.1.min.js (~3.6MB in total) onto GitHub Pages, which is backed by a global CDN.
Normally, GitHub Pages is used to serve complete websites. I'm not quite sure if any other projects "borrow" the functionality just to serve isolated JavaScript, and what GitHub's view of that is. Access-allow-origin is *, so there's that; I don't know how "locked in" that is.
For what it's worth, I don't think it would be a problem to serve the Dao demo JS via GH Pages - the traffic honestly wouldn't make any sort of dent. Of course, you could always move the entire demo over to github.io, in which case the files would be served within the context of a website and there wouldn't be any concern.
Alternatively, https://rawgit.com/ serves data directly from GitHub (not Pages) via a CDN, with the proper content type (as you probably know, GitHub's "raw" link serves everything as text/plain). This is a "99%" solution in that it has no SLA and it's a free service, but it's a pretty awesome one and could work well for Dao.
A couple other things.
First, for the ~30-40 seconds the page is loading, this is what I see. Some kind of "Loading..." indication could be cool.
Secondly, just in case it's useful (it may not be, that's not a problem), here are the errors and warnings I see when I load the page.
Finally - visit the website via IP, not hostname, and consider whether it would be useful to add a 0-byte index.html. FWIW, I see no urgent reason to insist on one being added, but again, I'm mentioning this in case it's useful.
> I don't know whether this is due to:
> - network latency/distance issues (I don't know what region your VPS is in, so I can't measure the distance from Sydney),
> - Aliyun-plan-specific traffic shaping, or
> - performance/throughput caps in the Dao-based HTTP server you're using.
I think it is due to the Aliyun bandwidth limit. The service package I bought supports a maximum of 1 Mbps download: 1 Mbit/8 = 128 KB/s, which is close to the download speed of dao.js.
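As a quick cross-check (just arithmetic on the figures from the wget output above, using the same rounding), the cap lines up almost exactly with the observed dao.js transfer time:

```js
// Sanity check of the 1 Mbps cap against the observed dao.js download.
const capKBps = 1024 / 8;      // 1 Mbit/s ≈ 128 KB/s
const daoJsKB = 2.89 * 1024;   // dao.js ≈ 2.89 MB ≈ 2960 KB
console.log((daoJsKB / capKBps).toFixed(1) + ' s');  // ≈ 23.1 s, close to the ~23 s wget reported
```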
On the server, I can download dao.js almost instantly:
I will consider upgrading the service, or using one of the solutions you suggested.
> Some kind of "Loading..." indication could be cool.
I will try to add this soon.
> Secondly, just in case it's useful (it may not be, that's not a problem), here are the errors and warnings I see when I load the page.
Fixed the errors for SetTimeZoneCookie(). dao.js.mem is generated by Emscripten; I'm not sure how to fix that one.
> Finally - visit the website via IP, not hostname, and consider whether it would be useful to add a 0-byte index.html
Done.
Thank you very much for reporting and looking into this issue:)
Huh. To be honest, I initially thought I was dealing with a latency problem, then I thought that maybe the HTTP server had a throughput limit. I was pretty sure the traffic-shaping possibility was unlikely! This was a good reminder to always keep an open mind :) - and it's very cool to see that the Dao HTTP server is not performance/bandwidth-limited :D
I'm not aware of any other simple/cheap(/free :P) ways to get (properly MIME-typed) code onto a CDN besides GitHub Pages or rawgit. At the end of the day, it makes a lot of sense to serve large static files via a fast service designed for that, and to reserve the "real" compute logic for actual processing endpoints. Indeed, I was poking around Google App Engine recently to learn more about how it worked and found that it has specific provisions for static file hosting that use different resources (at much lower cost) than its actual computation services. A properly configured app wouldn't use dynamic routing (through a PHP, Go or Java backend, etc.) to serve static text that will never change. This is a very similar scenario.
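Just to illustrate the split I mean (sketched in Node.js rather than App Engine config, with made-up paths and port), the idea is simply to keep the file-serving branch somewhere dumb and fast, and the compute branch in the app:

```js
// Illustration only: static assets handled by a plain file/CDN layer,
// dynamic responses handled by the app. Paths and port are placeholders.
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  if (req.url.startsWith('/static/')) {
    // In practice this whole branch would live on a CDN or static host.
    const file = path.join(__dirname, 'public', path.basename(req.url));
    fs.createReadStream(file)
      .on('error', () => { res.writeHead(404); res.end(); })
      .pipe(res);
  } else {
    // The only part that genuinely needs per-request compute.
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('computed at ' + new Date().toISOString());
  }
}).listen(8080);
```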
The "loading" indication is probably a good idea - in my case, once load times are reduced, I suspect the page will load in about 3-5 seconds, which is pretty fast to me. Other slow connections will still see slowdown though, so indicating that the page is loading is probably a good idea. I'm not aware of any easy way to add progress indication. It's possible the links in the second message in this thread might be relevant, but I'm not sure. I also found this thread but that seems less useful. It's sad Chrome has moved away from the era of fixed loading progress bars...
Also, to clarify, I meant Access-control-allow-origin in my last message. And by "locked in" I meant that I was unsure whether GitHub Pages makes any sort of guarantee to preserve that access-control header, or whether it might be turned off in the case of possible future security incidents, "borrowing" of the CDN, or similar. (The point being that if CORS is disabled, sites that serve JS off a single github.io domain/project - the canonical GitHub Pages use case - will continue to work, and only things that try to be fancy will be affected. Obviously the header has been explicitly added; I'm just not sure whether its existence is documented anywhere as "yes, this is supported, and here is how you are allowed to use it".)
I think Emscripten is loading dao.js.mem as JavaScript just to get it into the browser cache while the main blob downloads, so that everything downloads in parallel, and then Emscripten reloads the file via XHR directly out of the cache when it's needed. One way to fix the error would be to encapsulate the binary data into Unicode text and then wrap the whole thing inside a comment, but that's outside the scope of this bug :P
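Incidentally, that XHR is also where the CORS question above actually bites: pulling scripts in via plain script tags from another origin doesn't need CORS at all, but a cross-origin XHR for dao.js.mem does. A rough illustration (the URL is a made-up placeholder, not anything from this demo):

```js
// A cross-origin <script> tag works without CORS, but an XHR like the one
// Emscripten uses for the .mem file only succeeds if the response carries
// Access-Control-Allow-Origin (e.g. "*"). Placeholder URL below.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://example.github.io/dao-demo/dao.js.mem');
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
  console.log('got ' + xhr.response.byteLength + ' bytes');
};
xhr.onerror = function () {
  console.log('request blocked - likely a missing CORS header');
};
xhr.send();
```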
Finally, regarding the IP bit, that was why I mentioned "the Dao-based HTTP server you're using" in my original message :) - and okay, if a 0-byte index.html is a good idea, I wonder if a redirect to the hostname would be even more complete. Right now reverse DNS doesn't map the IP to the hostname. I think this is a matter of preference.
Just upgraded the Aliyun service package to support 5 Mbps bandwidth. Now the demo page can load in a few seconds :). A loading indicator has also been added to the page.
> I wonder if a redirect to the hostname would be even more complete. Right now reverse DNS doesn't map the IP to the hostname.
This sounds better:)
The crashing issue of the web server was solved some days ago; no more crashes have been observed since the server restart. I think this issue is resolved.
Just wanted to ACK the last couple of messages.
I just did a very anecdotal test, counting out seconds starting immediately after hitting Enter on my laptop (with quite a few tabs open). The spinner disappeared and everything was ready after 12 seconds.
It would seem my test was actually rather empirical though - devtools says the page reloaded (with cache disabled) within 13 seconds, heh.
But, thing is, I see a loading spinner that tells me the page is working after just 3 seconds! Again, on a slightly old machine with several hundred tabs open. I agree; very well fixed. I browse a lot of websites that take far longer than 12 seconds to load.
It's great to hear the crash issues have been fixed too!
Stumbled on Dao a few days ago while doing one of those periodic broad sweeps for interesting programming languages :)
I wanted to share a weird network problem I'm experiencing with the online demo. I see this:
Initially I thought that maybe that was what I was supposed to see, but the buttons and the submit button don't do anything. So I immediately opened devtools, and was presented with this:
Huh?!
So I reloaded to have a look at the network tab, and
Wat.
Incidentally the process of reloading the page made the console show different errors.
Some of the network errors are different this time.
So I right-clicked one of the .js files, ace.js, to open it in a new tab. And...
It works. It's fine. I can even wget dao.js (a 2.9MB wall of text is a bit much for Chrome :D) and that completes fine too.
Repeatedly reloading the page consistently produces the same errors though.
So, off to chrome://net-internals for further introspection!
The vaguely centered blue request for dao.js is on the left, with the "O.o?!" at the bottom right: the connection was reset. Chrome strips identifying and response info from chrome://net-internals, so I used chrome://net-export to save an unredacted log of everything. Unfortunately that didn't really highlight anything interesting.
You may have also noticed the "content-length mismatch" errors with jQuery. Chrome gets partway into the download and then inexplicably falls over:
And yet I can also open jquery-1.9.1.min.js file in a new tab and it loads just fine.
Unfortunately I have no fun debugging story - it's yet to be written :)
I'm quite mystified by what I'm seeing here, and very curious as to what's going on. I tried using a website screenshotting service on the page in order to test the site on an independent browser on an independent network (I don't have any machines/VPSes/etc I can borrow right now), and interestingly, I see exactly the same thing there as I see locally:
That suggests the possibility that this is not specific to my network (Telstra ADSL2+ in Australia) or my browser (Chrome 58.0.3029.96 - I should upgrade soon, I know).
If you can't reproduce this on your end I can pursue this a bit more; perhaps I can chase down the Chromium guys. If I copy all the headers my browser is sending into curl, the requests execute 100% fine.
I wonder if the Web server you're using has any race conditions regarding multiple connections, perhaps from the same IP?
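If it helps to poke at that theory, something like the following is what I'd try - a rough Node.js sketch that opens several connections at once (the hostname is a placeholder; the paths are just the files the demo page pulls in):

```js
// Rough repro sketch: request the demo's assets concurrently from one IP and
// watch for resets or truncated bodies. Hostname below is a placeholder.
const http = require('http');
const paths = ['/jquery-1.9.1.min.js', '/ace.js', '/dao.js', '/dao.js.mem'];
paths.forEach((p) => {
  http.get({ host: 'demo.example.com', path: p }, (res) => {
    let bytes = 0;
    res.on('data', (chunk) => { bytes += chunk.length; });
    res.on('end', () => console.log(p, res.statusCode, bytes, 'bytes'));
  }).on('error', (err) => console.log(p, 'error:', err.message));
});
```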