web3rover / Qazy

Lazy Loading - No SEO Negative Impact

Doesn't work (?) #2

Open mindplay-dk opened 10 years ago

mindplay-dk commented 10 years ago

Tested in Chrome, and it doesn't seem to work - I look in the network tab, and all the images are downloaded immediately.

Even though your script does pick up the image tags and remove the src attributes, it is doing so when the load event fires - at which point the browser has already started downloading the images.

I could be wrong, but as far as I understand, this approach has already been tried, and was already known not to work?

web3rover commented 10 years ago

When the src attribute is removed, the browser stops downloading the image. So if the images are large, you will see the difference. If your images are something like 10-100KB, you won't see the difference. In that case lazy loading is not even required.
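
A stripped-down illustration of the behaviour I mean (not the full script; the file names are just an example):

<img id="big" src="huge-photo.jpg" alt="Example">
<script>
window.addEventListener("load", function () {
   // Clearing the src after the fact relies on the browser cancelling
   // the still-in-flight download of the large image.
   document.getElementById("big").removeAttribute("src");
});
</script>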

mindplay-dk commented 10 years ago

I emulated a low bandwidth connection while testing your demo page - it continues for about a minute, fully loading all the images.

Even though it does initially replace the images with spinning placeholders, until the images are fully loaded, it does not appear to delay loading as such - images continue to fully load in the background.

How are you testing this, and on what browsers?

web3rover commented 10 years ago

I have tested it in all the latest browser releases. Just now I inspected the page in Chrome and I see the downloads get cancelled. It starts downloading an image when it enters the viewport.

web3rover commented 10 years ago

Many have already integrated it into their websites and it's working for them. If you want, I could make a video and show it to you.

mindplay-dk commented 10 years ago

I'm puzzled as to why it doesn't seem to work on Chrome for me.

Can you point me to a real site where this was implemented?

netorica commented 10 years ago

Can you show us a website of yours where you implemented this?

web3rover commented 10 years ago

Please check the demo.

mindplay-dk commented 10 years ago

But you said yourself,

If your images are something like 10-100KB, you won't see the difference

The images in your demo are ~25KB each, so the demo page can't actually demonstrate the script.

Also, if I have a page of photos, each 640x480 px, those are probably around 100KB each - you say I won't need the script then, but that depends on how many images I have, more so than on their individual size.

So this approach only works for a very specific use case - it can't be considered general purpose, since in e.g. most photo galleries or product listings with lots of images, it won't actually be of any use. I'm betting this is why most everyone else has abandoned this idea... I wish it were possible to make this work well, but as browsers are right now, I don't think it is.

web3rover commented 10 years ago

You also need to consider the bandwidth speed. Lazy loading (or this library) is useful in two cases:

  1. Slow Internet connection. In this case even ~25KB images will be lazily loaded.
  2. Too many images. Even with a fast Internet connection the page loads slower due to the number of requests, so this script is useful because it blocks those requests until they are required.

tonygustafsson commented 10 years ago

There is a reason people skip the src and use data attributes instead :)

mindplay-dk commented 10 years ago

There is a reason people skip the src and use data attributes instead :)

Yes, but that approach is wrong - an <img> tag without a src attribute is not a valid <img> tag, so that approach completely defeats the purpose of using <img> tags in the first place. You might as well use <div> or some other tag; at least then you would be producing valid markup. Broken <img> tags have no value in SEO terms anyhow - they can only be understood by the script on the page; to the rest of the world it's just broken markup...

rjgotten commented 10 years ago

Yes, but that approach is wrong - an <img> tag without a src attribute is not a valid <img> tag, so that approach completely defeats the purpose of using <img> tags in the first place. You might as well use <div> or some other tag; at least then you would be producing valid markup. Broken <img> tags have no value in SEO terms anyhow - they can only be understood by the script on the page; to the rest of the world it's just broken markup.

Has there ever been any thorough testing of what Google and other search engines make of simple anchors or meta tags augmented with http://schema.org/ImageObject markup?

E.g.

<figure itemscope itemtype="http://schema.org/ImageObject">
  <meta itemprop="url" content="./images/lazily-loadable-image.jpg" />
</figure>

Seems the most reasonable alternative, really...
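
For completeness, a loader could then pick those up with something along these lines (a rough sketch only - no viewport detection shown, and the selector is just an example):

<script>
document.addEventListener("DOMContentLoaded", function () {
   // Swap every schema.org ImageObject placeholder for a real <img>.
   var metas = document.querySelectorAll(
      'figure[itemtype="http://schema.org/ImageObject"] meta[itemprop="url"]');
   Array.prototype.forEach.call(metas, function (meta) {
      var img = document.createElement("img");
      img.src = meta.getAttribute("content");
      meta.parentNode.appendChild(img);
   });
});
</script>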

tonygustafsson commented 10 years ago

mindplay-dk, Usually we set the src to a preload image that is then replaced by the JavaScript. I just meant that the browser will load the images automatically - the imgs are detected before the JavaScript is loaded, so the images will start loading before the script can do anything.
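
I.e. something along these lines (attribute and file names are just an example):

<img src="preload.gif" data-src="real-photo.jpg" alt="Photo">
<script>
// The real URL only lives in data-src, so the look-ahead parser never
// requests it; the script copies it over later (here immediately, in
// practice when the image scrolls into view).
var img = document.querySelector("img[data-src]");
img.src = img.getAttribute("data-src");
</script>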

rjgotten, That would be awesome. I'm curious about this too.

mindplay-dk commented 10 years ago

Has there ever been any thorough testing of what Google and other search engines make of simple anchors or meta tags augmented with http://schema.org/ImageObject markup?

That is an interesting thought indeed! :-)

tusharmath commented 10 years ago

@mindplay-dk Qazy's implementation cannot work. You are right that the images will be loaded before this plugin even gets initialized. I would recommend using a #! or hashbang instead, as described in the following blog - http://www.idea-r.it/blog/110/en/lazy-loading-seo-problem

web3rover commented 10 years ago

@tusharmath In that article there are five solutions, but none of them are friendly:

  1. Sitemaps: Many websites don't even use sitemaps. Including images in sitemaps requires further development of the site.
  2. Noscript: Including images in noscript tags looks like spam to Google. Google doesn't give much attention to noscript tag content.
  3. Escaped Fragments: This also requires further development of your site. Not suitable for managed hosting services or WordPress users.
  4. Hijack Links: This solution suffers problems when less sophisticated search bots enter your site. It is also a dirty solution.

Keeping all these things in mind, I developed Qazy. If all the images are loaded within a few milliseconds then it's not a problem. But if bandwidth is slow or there are lots of images, then Qazy works as expected.

Don't forget that lazy loading is only for making the page load faster. Nothing more than that.

waqasy commented 9 years ago

Ok, I deleted my first comment, which you might have seen in your inbox.

Anyway, I thought I should test more before criticizing others' efforts. So I just found the problem: it's about how the script is loaded. You need to put the script in the head of the site, on a fast server, so that it gets loaded before your images start loading. That is the problem. Instead of linking the js file I just dumped the whole script into the site head, and every time my images' downloads were cancelled, and when they entered the viewport they loaded fine. Which means nothing is wrong with the script. Also, the base64 gif takes too much time to load; maybe we should use a link to a gif instead of base64, so that it is downloaded in parallel, or just not use a gif at all.

And the reason the images get loaded when the script isn't loaded in time is that, unlike other lazy loading solutions, Qazy doesn't ask you to replace your image src with gifs. That's why, if the script fails to win the race against the images, the images overtake it. So the only solution is what I've mentioned above.
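
In other words, roughly this shape (just an illustration of the placement, not the exact file):

<head>
<script>
// the whole lazy-loading script pasted inline here (rather than linked
// as an external .js file), so it doesn't have to be fetched and can
// win the race against the image downloads below
</script>
</head>
<body>
<img src="photo-1.jpg" alt="">
<img src="photo-2.jpg" alt="">
</body>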

@qnimate your statement about the size of the images in your first reply is a little funny and a foolish excuse (:

mindplay-dk commented 8 years ago

Well, this should more or less end the debate:

[Fiddler screenshot]

It is indeed "lazy loading" the images - after it has already downloaded them all once. So rather than saving requests, this actually increases the total number of requests and the bandwidth overhead.

With that said, with a very high number of images (several dozen) or very large images (several megabytes each) and a slow connection, you may be saving some bandwidth, although the net total number of requests is still going to be higher.

So does this script "work"? Well, if your use case is lots of images, and if your userbase is primarily users on slow connections, yes, this script does "work" - it'll save some bandwidth and maybe improve the experience for users on slow connections. For users on fast connections, the bandwidth usage will be higher. And in every case, the net number of requests will be higher.

I would not advertise this script as a general purpose "solution" - the provided example doesn't even demonstrate the only meaningful use-case with several dozen fairly large images.

In addition, there is no specification that says browsers have to behave this way - the fact that Chrome is clever enough to abort downloads mid-way is an implementation detail that you really shouldn't rely on. I haven't tested any other browsers (Chrome support is a must for me), and no standard prescribes this kind of behavior.

The "modern" way to do this, is with ImageObject meta-tags - combined with <noscript> tags for antiquated clients with no JS support, I have successfully implemented this on a couple of sites by now, and it works like a charm.

IMHO, this approach has no real justification - attempting to stop HTTP requests after the fact is a flawed and unreliable approach. There are proven, better ways.

rjgotten commented 8 years ago

The "modern" way to do this, is with ImageObject meta-tags - combined with

It's what I settled on using as well and I have yet to hear of adverse effects.

I suppose the truly bleeding edge way of handling this would involve the ServiceWorker API to temporarily hold a block on the image downloads, somehow.
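
Something like this in the worker, maybe (completely untested sketch; the "eager" query parameter is an invented convention the page would add when an image scrolls into view):

self.addEventListener("fetch", function (event) {
   var url = new URL(event.request.url);
   // Answer plain image requests with a 1x1 transparent GIF; only requests
   // flagged as "eager" are allowed through to the network.
   if (event.request.destination === "image" && !url.searchParams.has("eager")) {
      var gif = Uint8Array.from(
         atob("R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"),
         function (c) { return c.charCodeAt(0); });
      event.respondWith(new Response(gif, {
         headers: { "Content-Type": "image/gif" }
      }));
   }
});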

GeassCoder commented 8 years ago

mindplay-dk is right. DOMContentLoaded should be used instead of the load event. I wrote some very simple prototype code to implement qnimate's general idea of swapping the img.src attribute on DOMContentLoaded, and tested it in the latest version of Chrome; it works fine.

Here is my prototype code.

blank.jpg is the placeholder image (which you may want cached in the browser at all times), my-img.jpg is the real image to be displayed, and a button click is used to "mimic" scrolling into the viewport, for simplicity.

<html>
<head>
<script>

var src, img;

// As soon as the DOM is parsed (before the load event), stash the real
// URL and swap in the placeholder.
document.addEventListener("DOMContentLoaded", function(event) {
   img = document.getElementById("test");
   src = img.src;
   img.src = "blank.jpg";
}, false);

// Restore the real URL on demand; the button click stands in for
// "scrolled into the viewport".
function foo () {
   img.src = src;
}
</script>
</head>
<body>
<img id="test" src="my-img.jpg" alt="My Image" style="width:300px;height:220px;">
<button onclick="foo();">click</button>
</body>
</html>

rjgotten commented 8 years ago

swapping the img.src attribute on DOMContentLoaded

-- won't work reliably.

You're forgetting about look-ahead parsers and parallel downloads.

GeassCoder commented 8 years ago

-- won't work reliably.

You're forgetting about look-ahead parsers and parallel downloads.

You are right, I didn't think of that...

Actually, now that I think it over again, I tend to believe the whole idea of lazy-loading images may be invalid. Images are not critical rendering resources and should not delay the first render or slow down the page anyway. Even if you can really make it initiate the download only after the image has been scrolled into the viewport, since it may take a while to download and render, that may be an even worse user experience compared with pre-downloading the images in a non-blocking manner. The only benefit it can provide, as far as I can see, is that users save some bandwidth if they never scroll down to the out-of-viewport images. But I don't know how many people really care about the data usage of those few extra images, and this is also a digression from the original intent of the project.

mindplay-dk commented 8 years ago

@GeassCoder there are two reasons you'd want to lazy-load images: on mobile devices, to preserve the end user's bandwidth - and on desktop browsers, to preserve your server's bandwidth. It's quite feasible to do this without impacting user experience, by loading far enough below the fold - in fact, it may improve user experience, since there is less overall server and bandwidth stress, which means quicker delivery. It's also quite possible to do this without impacting SEO. The idea is definitely not invalid - it's just not necessary or beneficial in every case; it depends on your project and situation.

yellow1912 commented 8 years ago

@mindplay-dk by ImageObject do you mean https://schema.org/ImageObject or something else?

tom10271 commented 6 years ago

It is working in my local dev env but not working at all on the production server...

tom10271 commented 6 years ago

I have made some discoveries. The short story is that it is very likely not going to work on a big site - the JS does not run quickly enough to kill the images. It works in your example because the critical rendering path is very short and the content is served from an HTTP/1.1 server. The long story is long, but I don't have time to explain much now.