shawneenc opened this issue 2 years ago
Hi @shawneenc I think you should try to update more often. In the Docker container 11.0.0 Chrome 78 is used (and Firefox 70) and a lot of things have changed in the browsers since then. For the projects I test I try to stay updated with the current stable browser version so it matches what most of my users use.
If you want to check what's changed in sitespeed.io since 11.0.0 you can do that in the changelog: https://github.com/sitespeedio/sitespeed.io/blob/main/CHANGELOG.md
If you want to see what changed in Chrome since 78 I think the best way is to check the Chromium commit log.
@soulgalore is there any way to run a version of sitespeed with a new version of chrome from docker? Each version run from Docker pulls the Chrome version that was available at the same time as that sitespeed version, not the newest version.
You need to build the container yourself then. You can base that container on https://hub.docker.com/r/sitespeedio/webbrowsers/tags (the tags indicate which browser versions exist in each container).
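For illustration, a rough sketch of what such a custom build could look like (the base image tag, the sitespeed.io version, and the assumption that Node.js/npm is available in the base image are all hypothetical; pick a real tag from the link above):

```
# Sketch only: pin a browser bundle and install a specific sitespeed.io on top
cat > Dockerfile <<'EOF'
# Hypothetical tag; check sitespeedio/webbrowsers on Docker Hub for real ones
FROM sitespeedio/webbrowsers:chrome-78.0-firefox-70.0
# Assumes Node.js/npm is available in the base image
RUN npm install --global sitespeed.io@24.0.0
WORKDIR /sitespeed.io
ENTRYPOINT ["sitespeed.io"]
EOF
docker build -t my-sitespeed:custom .
```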
In your metrics, the first visual change of 167 ms seems really fast; I think that may be broken. If you look at the video/screenshot, does that metric look correct? You can do the same check and see if you can spot what the last visual change is (to see if it's correct or not).
One thing that I remember changed (but I haven't checked the changelog) is that a couple of years ago we added 3 extra seconds before ending the test. Before it was loadEventEnd + 2 s and now it is loadEventEnd + 5 s, which can pick up more requests.
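If you want to compare against the old behaviour, something like this might work (a sketch only, assuming --browsertime.pageCompleteWaitTime controls the extra wait after the page is considered complete, as described in the browsertime documentation):

```
# Hedged sketch: shorten the extra wait after loadEventEnd from 5 s back to 2 s
docker run --rm -v "$(pwd):/sitespeed.io" sitespeedio/sitespeed.io:24.0.0 \
  --browsertime.pageCompleteWaitTime 2000 \
  https://www.example.com -n 1
```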
Hi @soulgalore,
Can you please explain why sitespeed sometimes takes the last visual change metric at a later point? I.e. in some instances it is taking it along with the visual completeness metrics and other times it takes it at a later stage. The screenshots have been included above. Do you know why sitespeed does this? Have you encountered any issues like this before?
Last Visual Change comes from the Visual Metrics script that analyses the video. Easiest is to check the video and filmstrip and see what that last change actually is. Sometimes it can be hard; over the years we have tuned how many pixels need to change for a frame to count as the last change. If you find something that doesn't match, it would be great if you could create an issue for that in https://github.com/sitespeedio/browsertime. There's a configuration that you can use: --browsertime.videoParams.keepOriginalVideo
That will keep the original video (the original recording from the screen); it will be named something with -original. Attach that to the issue, then it's easy to reproduce.
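For example, reusing the Docker invocation style from this thread (the URL is a placeholder), a run that keeps the original recording could look roughly like this:

```
docker run --rm -v "$(pwd):/sitespeed.io" sitespeedio/sitespeed.io:24.0.0 \
  --video --visualMetrics \
  --browsertime.videoParams.keepOriginalVideo \
  https://www.example.com -n 1
# Afterwards, look in the video folder of the result for a file named 1-original.mp4
```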
By screenshot I mean the screenshot from the video "filmstrip":
Hi @soulgalore,
On further investigation, it looks like with the newer version of sitespeed (24.0.0) the viewport size passed in via the docker command is being ignored, and it waits for the full page to render/load instead of only the viewport area. See the filmstrips below:
version 24.0.0:
version 11.0.0 - correct:
We cannot switch to a new version with this issue. It will throw all of our metrics off, as we compare browser render times sprint over sprint. We use last visual change for this.
Ah I see now, thanks. In your example you use --browsertime.viewPort 1366X768 but the correct way according to the docs is to use a lower case x: --browsertime.viewPort 1366x768. It seems like a couple of years ago (four?) the capital X also worked, but I don't have any memory that it should; I wonder if that was just by accident. Running --help in 11.0.0 also shows a lower case x, right?
Hi @soulgalore when we run from docker - this is the command we use, which has the lower case 'x':
docker run --shm-size=2g -v /home/ubuntu/workspace/rowser-rendering_shawneen_branch:/sitespeed.io sitespeedio/sitespeed.io:24.0.0 --outputFolder output --speedIndex true --video false --config ./config.json --multi ./tests/lp-multi-v3_cp.js --browsertime.url https://url/?performancetest=sitespeed --block js-agent.newrelic.com --block www.google-analytics.com --block www.googletagmanager.com --block bam.nr-data.net --browsertime.cacheClearRaw --browsertime.login performance --browsertime.viewPort 1366x768 --spa --budget.output junit -n 1
I did a run including the following to capture the original video:
--browsertime.videoParams.keepOriginalVideo true ${url} -n 1 --video --visualMetrics
This produced the following:
Original Video Filmstrip (v24) - page did not finish loading before last visual change was captured:
Normal filmstrip (v24)
Sorry I'm slow, I don't follow what you're showing me here? So using the lower case x you get the correct viewport, right?
No @soulgalore, I get the results displayed above when using the lower case 'x': --browsertime.viewPort 1366x768. It looks like in the more recent versions of sitespeed the viewport size gets ignored and it waits for the full page to render, not just the viewport-sized screen.
Yeah, but I don't see any difference in the images? Easiest is to try with something small like --browsertime.viewPort 200x200 and see what it looks like. When I tried yesterday it worked; when I tried with X on 11.0.0 it surprisingly worked too.
When the test ends depends on JavaScript that runs in the browser; check out https://www.sitespeed.io/documentation/sitespeed.io/browsers/#choose-when-to-end-your-test
We run these scripts against 3 separate environments. In one of the environments it worked last week; this week, however, the exact same script, run with the exact same parameters, is showing the behaviour outlined above. The other 2 environments we run the test against have always shown the above behaviour.
Looks like the later versions of sitespeed are flaky.
the behaviour outlined above
Do I understand correctly that you are seeing that the last visual change does not end when something is painted on the screen, but instead picks up the scrollbar? If that's the case, could you please run with --browsertime.videoParams.keepOriginalVideo and then, instead of attaching the screenshots, attach the video so I can try to reproduce it. If you look in the video folder you will have a file named 1-original.mp4.
With that original recording I can run visual metrics locally, get the metrics and see what's going on.
Hi @soulgalore,
Both videos attached - the first is the actual incorrect result and the 2nd is the original video
2nd - original video: https://user-images.githubusercontent.com/55998417/175037319-ffe664b0-2bea-4e96-b23a-5d1ac0dbcd27.mp4
Are you sure that's the original video? It should always start with some orange frames and not include any timing metrics. Check out https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1163058795 for how to get that.
@soulgalore I did a run locally also - these are the videos from that run:
Great, I'll have a look tonight.
Just to be totally clear, which of these two is wrong? Looking at the metrics they both look right, just one loads slower? Did you check the TTFB for both?
With the latest set of videos sent from the local run - these are all original videos.
In the comment before that - which had 2 videos - the first video is displaying the incorrect last visual change.
I need to have the original video from the run that gets the incorrect visual change; if I have that, I can try to reproduce it.
@shawneenc one thing you could try is to use the latest version of visual metrics; we have a version that @gmierz has been working on that we are going to make the default later this year, and maybe that will pick up the last change better. If you use the Docker container you can enable that with --browsertime.visualMetricsPortable true.
Else I need the original video from the run that fails; that's the only way for me to have a look and see if I can fix it.
Hi @soulgalore, the link to download the original video does not work from the results produced by the local run. I can send you the original video from a run within Jenkins; however, this will not be the exact same run the videos above came from. Jenkins also does not produce the exact set of videos you needed above (i.e. no orange strip at the beginning of the run). It is still the same page and measurement though. Included video from Jenkins below:
Yes, there's no HTML link to the original video, so you need to copy it from Jenkins or from wherever you upload the result. I need the original video; it works like this:
To be able to reproduce I need the original untouched video.
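If it helps to locate it, the kept original recordings can be found by searching the result folder (a sketch assuming the default sitespeed-result output folder; adjust the path if you use --outputFolder):

```
# Find any kept original recordings in the sitespeed.io result folder
find ./sitespeed-result -name '*-original.mp4'
```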
@soulgalore from the local run - there is no link to download/copy the actual original video. From jenkins the originally requested 'original' video link is not the correct one you require.
The video I have sent you is the only video I have for the original video you are requesting now. Are you saying this is not the correct video?
Yes you need to supply the original (named original video) from the run where you see that something is wrong. See my comment in https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1163058795 and the actual screenshot of which video.
@soulgalore I know what you are asking for. Can you understand what I am trying to explain?
There are issues producing what you are asking for, running both locally and from Jenkins - a different issue with each one:
Do you understand what I am saying here?
I'll try :)
From Jenkins, the 'original' videos from --browsertime.videoParams.keepOriginalVideo with the orange strip are not produced
You mean it's not created so there's a bug? Or how do you mean not produced? Or you cannot change the configuration?
The original video is not produced as a standalone file. Is there a way to store this video through a command-line option?
Can't you add --browsertime.videoParams.keepOriginalVideo? Or how do you run it locally?
The video from this (--browsertime.videoParams.keepOriginalVideo) is produced (I have sent this). You are also looking for another video though, right?
I need the original video where the orange frames are still there, no text has been added, it is from where you test, and you can see that the last visual change is wrong. I can see two attached videos that have the orange frames, but they do not have any late last visual change. Including videos that do not have the issue with the late visual change will not help, and screenshots will not help. The original video from when you see metrics that are off is the one I need.
The original video is named 1-original.mp4 from the first run, 2-original.mp4 from the second run, etc.
@soulgalore
Video generated from --browsertime.videoParams.keepOriginalVideo: https://user-images.githubusercontent.com/55998417/175292737-0c34e1ac-02ea-47c3-a0cc-18588788cb10.mp4
That video doesn't look like the ones you described in https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1161788448? If you look at the video, at the end "What would you like to learn today" is replaced by a "loading" state, and that is the last visual change. Or what do you mean is the last visual change?
@soulgalore that is the video that is generated from passing in this command : --browsertime.videoParams.keepOriginalVideo
@soulgalore the problem is that the newer version of sitespeed is taking the last visual change at a place and time where it should not. I.e. it waits for the whole page to load (content that is outside of what is displayed on the screen and what has been set as the viewport). You can see this from the filmstrips supplied - the scrollbar gets shorter and shorter with every later screenshot until the page is fully loaded. Is there nothing else you can do here to fix this?
This metric is incorrect, and we cannot switch to a version with this behaviour and report on a metric that is incorrect.
You can see this from the filmstrips supplied - the scrollbar gets shorter and shorter with every later screenshot until the page is fully loaded. Is there nothing else you can do here to fix this?
Yes there is, if you can please supply the original video from when that is happening; I don't see that in the videos you added. If you can add the video where that happens, then I can probably help. If you add other videos that do not have that symptom, it's hard to use them.
it waits for the whole page to load
I don't think that is the problem, because that's the way it has always worked? It waits until loadEventEnd fires + 5 seconds. It could be that the old versions used 2 seconds; if your page does Ajax loads that happen after load event end, that could then be a factor. If that's the case you can change when to end the tests, see https://www.sitespeed.io/documentation/sitespeed.io/browsers/#choose-when-to-end-your-test
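As a hedged example of what changing the end condition could look like (a sketch only; it assumes --browsertime.pageCompleteCheck accepts a JavaScript snippet as described in the documentation linked above):

```
# Sketch: end the test roughly 2 s after loadEventEnd instead of the default wait
docker run --rm -v "$(pwd):/sitespeed.io" sitespeedio/sitespeed.io:24.0.0 \
  --browsertime.pageCompleteCheck 'return (function() {
    var end = window.performance.timing.loadEventEnd;
    return end > 0 && Date.now() > end + 2000;
  })();' \
  https://www.example.com -n 1
```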
@soulgalore
_"You can see this from the filmstrips supplied - the scrollbar gets shorter and shorter with every later screenshot until the page is fully loaded. Is there nothing else you can do here to fix this? Is there nothing else you can do here to fix this?
Yes there is if you can please please supply the original video from when that is happening, I don't see that in the videos you add. If you can add that video where that happens, then I can probably help. If you add other videos that do not have that symptom its hard to use."_
I am sending you the video captured by: --browsertime.videoParams.keepOriginalVideo (that you asked for), which does not capture the incorrect behaviour.
Are you saying now that generating a video from: --browsertime.videoParams.keepOriginalVideo should capture the incorrect behaviour? As this is not the case
The screenshots are too small, so it's hard for me to see exactly what's going on. It could be that the scrollbar is picked up, some late content appears on the screen, or just a couple of pixels change. If I have the original video or a way to reproduce, I can have a look. If I can reproduce the same, I can also try to fix it and then verify that it works.
We have instructions on how to make a reproducible issue: https://www.sitespeed.io/documentation/sitespeed.io/bug-report/#explain-how-to-reproduce-your-issue - if there are pages that aren't public, you can try to reproduce the issue on a public site.
In those instructions you also have:
Is there a problem with the video or the metrics from the video? Then make sure to enable the full original video so you can share that with us, do that by adding --videoParams.keepOriginalVideo to your run. Look in the video folder for that URL and you will see a video named 1-original.mp4. Please share that video with us, then we can more easily reproduce/understand the problem.
Here, it is important that you share the video from where it fails, not another video, because that will not help. Please share the video from where it fails.
Are you saying now that generating a video from: --browsertime.videoParams.keepOriginalVideo should capture the incorrect behaviour? As this is not the case
Yes the video need to capture the incorrect behaviour.
Could it be worth removing the scrollbar to see if that's what's being picked up? There's a Chrome switch for that:
docker run --rm -v "$(pwd):/sitespeed.io" sitespeedio/sitespeed.io:25.2.1 --chrome.args "enable-features=OverlayScrollbar" https://www.sitespeed.io -n 1
@shaqb cool! To be able to reproduce and verify that a fix works, a reproduction/video really helps.
Thanks @shaqb - needed to be passed in as: --chrome.args 'enable-features'='OverlayScrollbar' ${url} -n 1
This did produce results without the scrollbar; however, sitespeed continues to wait for the whole page to load.
cc @soulgalore
Here is the original video produced during this run where there is no scrollbar, but sitespeed continues to load the page:
Screenshot in stages - so that they are readable and not too small:
We expect/need the last visual change to stop around where the visual completeness metrics are captured, i.e. when only the part of the page that is visible to the user and within the configured viewport has loaded.
Were you able to try https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1164030274? Add --browsertime.visualMetricsPortable true and run the rest the same; then it uses the coming version of visual metrics.
For the screenshots, it's hard for me to tell if there's some change between 6.2 s (fully loaded) and 6.7 s (last visual change). Depending on which font a website uses, I've seen that Chrome can do final anti-aliasing late on Linux, but that is hard for me to see. Can you spot any difference?
Hi @soulgalore - yes, the run above was run with --browsertime.visualMetricsPortable true. Individual screenshots included below:
Fully loaded @ 6.2 seconds:
Last visual change @ 6.7 seconds:
The time that last visual change should have been captured at:
The page is fully loaded on the screen at 3 seconds. Sitespeed should not continue to wait for the remainder of the page to load, which is not within the viewport setting or what users actually see on the screen initially.
Ok cool.
Please also run with --browsertime.videoParams.keepOriginalVideo, so I can reproduce and potentially fix the problem + see what's going on.
@soulgalore the video is included in this comment: https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1165742311
@soulgalore is there a way to force the newer version of sitespeed (24.0.0) to run against an older version of chrome through the docker command? I have searched and can't find anything that states this is possible.
I checked this area: https://www.sitespeed.io/documentation/sitespeed.io/browsers/#add-your-own-chrome-args. It mentions that an arg can be passed in, i.e. like -b chrome. Can I set the version though?
I want to run a test with sitespeed v24 and an older version of chrome to see if the issue could be related to the version of chrome. Thanks
When I run with --browsertime.visualMetricsPortable true on that video, I get FirstVisualChange 233 ms and LastVisualChange 2266 ms, which do not match your screenshots. Are you sure you used a fairly new version of sitespeed.io when you did that test? The latest release is 25.3.1.
Chrome does not support giving it a specific version and having it download that version. You need to build each container yourself if you need to test an old version, as in https://github.com/sitespeedio/sitespeed.io/issues/3677#issuecomment-1157640691.
@soulgalore what is the full docker command you are using to run? I used 24.0.0 for that test
There's no way for me to run directly in Docker as a test since you haven't provided a reproducible test case, so I run visual metrics directly. There have been fixes in that script, so please try with the latest sitespeed.io. Thanks.
Same problem exists with versions after 20.0.0
It's strange, because when I test with --browsertime.visualMetricsPortable true vs running without it I get different metrics from the video. Let me make sure we write a log entry when that parameter is used so it's easy to understand whether it works or not.
We switched to the new portable script as default a year ago. Do you know if this is still a problem?
Hi @soulgalore,
We are currently on version 11.0.0 of sitespeed and are looking to upgrade to a newer version. We have attempted this before, but have constantly come up against many issues with the newer versions, i.e. major differences in how sitespeed measures the rendering and major differences in the metrics produced.
Example when testing with version 24.0.0 vs 11.0.0: First Visual Change and Last Visual Change metrics are way off, i.e. 1.6 seconds vs 9.6 seconds. I am running the exact same script against both versions locally using the following docker command:
docker run --shm-size=2g -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:24.0.0 --outputFolder assignment_output8 --speedIndex true --video true --multi ./tests/lp-multi-v3_cp.js --browsertime.url https://perf-aws.front.develop.squads-dev.com --browsertime.login testTe --block js-agent.newrelic.com --block www.google-analytics.com --block www.googletagmanager.com --block bam.nr-data.net --browsertime.viewPort 1366X768 --spa --browsertime.iterations 1
The same issue exists when I try running with versions from 18.0.0 upwards.
11.0.0:
24.0.0:
Within the waterfalls for both runs, I can see that when running with version 24.0.0 - firstVisualChange and lastVisualChange measurements have been moved from where they were caught in version 11.0.0. Also 26 requests were hit in version 11.0.0 and only 23 hit in version 24.0.0. See diagrams below:
11.0.0:
24.0.0:
Please advise