ccampbell / sonic

fast, lightweight PHP 5.3 MVC framework
http://www.sonicframework.com
Apache License 2.0

Issues with CSS in Turbo mode #15

Open gordyr opened 12 years ago

gordyr commented 12 years ago

My apologies for posting another issue so soon.

I seem to have an issue regarding CSS in turbo mode.

My understanding is that each pagelet should not be displayed until its CSS has been downloaded and parsed. However, for me this is not always the case, resulting in the occasional CSS 'flash'.

My main layout loads a barebones CSS file containing only the styles for the header and the main layout divs (leftcolumn, rightcolumn, etc.).

Then, inside one of the templates, I am using the addCss method as instructed to load the CSS for that particular template. However, the template is occasionally visible before its CSS has been parsed. Obviously I can easily get around this with JavaScript, simply by setting each pagelet to start out with display:none and then showing them on load, but that kind of defeats the point. Am I missing something?
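(For reference, the stopgap I mean would be something along these lines; just a sketch, where the pagelet class is a made-up hook hidden via display:none in the base stylesheet:)

// reveal every element carrying the (hypothetical) "pagelet" class once
// the whole page has loaded; getElementsByTagName keeps old IE happy
window.onload = function() {
    var divs = document.getElementsByTagName('div');
    for (var i = 0; i < divs.length; i++) {
        if (divs[i].className.indexOf('pagelet') !== -1) {
            divs[i].style.display = 'block';
        }
    }
};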

Secondly, and more importantly: in IE I am only getting the CSS for the top pagelet. If I clear IE's browser cache it is fine for the first load, but on subsequent loads the individual pagelets appear to be cached and the CSS for each one does not get downloaded. I've played around with the usual headers for preventing caching in IE, but nothing seems to work.

Is there a workaround you know of?

Thanks... This aside, I am absolutely blown away by your implementation of Facebook's BigPipe. The user responsiveness is utterly astounding. :-)

ccampbell commented 12 years ago

This is interesting. Honestly I haven't done too much with turbo mode. I was just going over the code.

It's strange because it goes:

_addCss(data.css);                                     // 1. inject the pagelet's stylesheets
doc.title = data.title;
doc.getElementById(data.id).innerHTML = data.content;  // 2. render the pagelet markup

for (var i in data.js) {
    _js_queue.push(data.js[i]);                        // 3. queue the pagelet's scripts
}

_processQueue();                                       // 4. process the queued scripts

It adds the CSS first and only then sets innerHTML, so I'm not sure why it would be showing the content before the CSS.

As for the IE thing, which IE version is it? It might be something related to the file name; I've seen weird things when adding assets dynamically. Try doing something like

<?php $this->addCss('test.css?' . time()) ?>

That's obviously not a perfect solution, because it means the CSS file will never be cached, but maybe it could be done only on dev.

Feel free to play around with the javascript. It's definitely not perfect.

gordyr commented 12 years ago

Thanks Craig, I'll give your suggestions a try. From poking around in the file it seems odd to me too.

My thought is that it is probably the gap between the file being injected into the DOM and the browser finishing parsing it, which can be several milliseconds with large CSS files.

Working around this would require some kind of custom event that fires when the CSS has finished parsing, so that innerHTML is set then and only then.

I just tested with a fake, super-large CSS file and the issue became more noticeable, so this definitely seems to be the case.
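That custom event could be approximated by polling document.styleSheets until the injected sheet's rules become readable. A rough sketch (not Sonic code; onCssParsed is a made-up name, and this behaviour varies between browsers and fails for cross-domain sheets):

// poll until the stylesheet matching href has been parsed, then fire the
// callback; cssRules is null or throws until parsing is complete
function onCssParsed(href, callback) {
    var attempts = 0;
    var timer = setInterval(function() {
        if (++attempts > 200) { // ~2s safety net so we never poll forever
            clearInterval(timer);
            callback();
            return;
        }
        for (var i = 0; i < document.styleSheets.length; i++) {
            var sheet = document.styleSheets[i];
            if (!sheet.href || sheet.href.indexOf(href) === -1) {
                continue;
            }
            try {
                if (sheet.cssRules || sheet.rules) { // "rules" is the old IE name
                    clearInterval(timer);
                    callback();
                    return;
                }
            } catch (e) {
                // not parsed yet; keep polling
            }
        }
    }, 10);
}

The innerHTML assignment in the snippet above would then move inside the callback.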

I will likely do some work on the JS as suggested. I have a couple of ideas already.

I haven't dug too far into the server-side part of your turbo implementation, but it appears that it builds each pagelet in turn and then flushes the buffer, which of course results in a lovely increase in user-perceived responsiveness.

However, you could take this a giant leap further by processing the pagelets asynchronously, since my understanding is that PHP does not do this by nature.

cURL, however, does support issuing requests in parallel (curl_multi), and I believe this is how Facebook's BigPipe not only improves responsiveness but actually reduces page load time.

See the following links:

http://www.somacon.com/p537.php
http://www.rustyrazorblade.com/2008/02/curl_multi_exec/
http://code.google.com/p/multirequest/

My understanding is that they keep the connection open (you can see this by checking a waterfall profile in Firebug on any Facebook page) and make multiple requests at the same time, one per pagelet. Hence the greater emphasis on dependency management and the need for the load-order parameters they seem to have on many of their pagelet initialisation calls.

My knowledge of cURL/PHP is nowhere near good enough to implement this myself, but the JavaScript side of things I could potentially contribute to. I'll probably start looking at improved dependency management within your turbo script tomorrow. Currently, if I have two views in a layout which share a dependency, the file is requested twice (although obviously the second request is served from the browser cache), but this still slows down the browser because the JavaScript is parsed twice, I believe.

I think I already have some code that I can pretty much drop in to alleviate some of this.
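Roughly, it just remembers which URLs have already been queued so that shared dependencies are fetched and parsed only once. A sketch against the _js_queue from the snippet above (_loaded_js and _queueJs are made-up names, not Sonic's actual code):

// skip any script url that an earlier pagelet has already queued
var _loaded_js = {};

function _queueJs(url) {
    if (_loaded_js[url]) {
        return; // already requested; don't fetch or parse it again
    }
    _loaded_js[url] = true;
    _js_queue.push(url);
}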

Anyway... Just some thoughts (although not related to this issue, my apologies) as to how this project could move forward. It's the most complete BigPipe implementation I have seen and is fantastic in its current incarnation, but there's always more we can do. :-)

ccampbell commented 12 years ago

I'm all for processing them asynchronously, but as you said PHP does not offer support for this.

I think the whole point of Facebook's implementation is that they do not do what you are proposing. Even if curl_multi could be faster, it adds extra HTTP overhead, because instead of one request you now have many. Also, state would have to be recreated for each request (for example, checking whether the user is logged in, or anything else that needs to carry through to the later parts of the request).

I think the server side end is pretty good as is. The time until the user sees the page is fast and then the slower parts are just to fill in the page fragments. It means the user doesn't have to wait for a slow database query to finish before he/she can interact with the page.

Also, you have to remember this isn't some magic solution. It is just a hack to make the page feel faster to the user; the total time for the whole request is actually slower. And you have to look at where you are optimizing: if things are really slow, perhaps there are other ways to make them faster :P

gordyr commented 12 years ago

Oh, don't get me wrong, things are fantastically fast. My app is averaging 500 ms page loads, complete with all JS parsing etc., and there is a LOT of JS, as amongst other things I am replicating Pro Tools/Cubase-like features in real time on MP3 files.

Volume automation, EQ, etc., all in JavaScript as you play a song.

Using your framework has knocked around 300-400 ms off, depending on the browser. So I'm certainly not criticising in any way. :-)

I have an annoying need to customise and improve everything I can get my hands on, especially when I really like a project... and Sonic definitely qualifies.

I just wish I had a better PHP background so I could be more useful and at least throw a few ideas around.

Anyway... it was just a thought that occurred to me while playing around today. The emphasis they place on 'pipelining' in their blog post makes me think that, in some manner, they are pulling these pagelets down asynchronously.

This might be an interesting read for you:

http://www.olympum.com/java/facebook-bigpipe-in-an-async-servlet/

EDIT: And this one is an even better link, describing asynchronous HTTP transfers within a single request.

http://www.cubrid.org/blog/dev-platform/faster-web-page-loading-with-facebook-bigpipe/

Interesting stuff!

You're quite correct that it isn't necessary, as Sonic is blisteringly fast as it is. But it's nice to play with stuff like this sometimes. :-)