ftlabs / perf-widget


Suppressing Widget on sites that can't be analysed #75

Closed seanmtracey closed 8 years ago

seanmtracey commented 8 years ago

Issue

Some sites exist on the .ft.com domain but are only accessible from our internal network (such as Stash). The performance widget will try to run automatically because the domain matches *.ft.com, but the external providers we use to generate insights will never be able to access the web page, leaving us with a perf widget forever displaying 'Analysing'.

Steps to reproduce

Go to any website on the ft.com domain that is only accessible on the internal network. P-widge will start automatically, but never resolve.

Proposed solution/s

1) A blacklist of internal FT services that we don't want the p-widge to run on automatically (a rough sketch of this follows below).

2) If we don't want a list of internal FT services existing out in the world, we could have a timeout on the insights generation and display a message in the widget saying we couldn't analyse the page.

3) Playing Devil's Advocate (we shouldn't actually do this): we could expose the internal FT network to our Heroku service and allow the 3rd-party insight providers to access those services using our Heroku app as a proxy.
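A minimal sketch of option 1, assuming a hypothetical `INTERNAL_BLACKLIST` structure and using `stash.ft.com` purely as an example entry; this is not the real perf-widget code:

```typescript
// Hypothetical blacklist of internal-only FT hosts the widget should skip.
const INTERNAL_BLACKLIST: ReadonlyArray<{ host: string; reason: string }> = [
  { host: "stash.ft.com", reason: "Only reachable from the internal FT network" },
];

// Returns the matching blacklist entry for a page URL, or undefined if the
// page is fine to analyse.
function blacklistEntryFor(pageUrl: string): { host: string; reason: string } | undefined {
  const { hostname } = new URL(pageUrl);
  return INTERNAL_BLACKLIST.find((entry) => hostname === entry.host);
}

// Usage: decide whether to start the widget on the current page.
const entry = blacklistEntryFor("https://stash.ft.com/projects/example");
console.log(entry ? `Skipping analysis: ${entry.reason}` : "Starting analysis…");
```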

AdaRoseCannon commented 8 years ago

Just changed whitelist -> blacklist

AdaRoseCannon commented 8 years ago

But I think a timeout (2) may be a good idea. Instead of having the extension constantly ping the server from now until kingdom come, ping for 2 minutes and then say, 'The server appears to be under heavy load right now, try again soon. [Big Try Again Button]', which would trigger a non-fresh refresh.

I'll mock one up and do a PR.
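A minimal sketch of the timeout idea above, assuming a hypothetical `/insights` polling endpoint and response shape; the two-minute deadline mirrors the suggestion in the comment:

```typescript
// Poll the (hypothetical) insights endpoint for up to two minutes, then give
// up so the UI can show the heavy-load message and a "Try Again" button.
async function pollForInsights(pageUrl: string): Promise<"ready" | "timed-out"> {
  const deadline = Date.now() + 2 * 60 * 1000; // 2-minute cut-off
  const intervalMs = 5 * 1000;                 // poll every 5 seconds

  while (Date.now() < deadline) {
    const res = await fetch(`/insights?url=${encodeURIComponent(pageUrl)}`);
    if (res.ok) {
      const body = await res.json();
      if (body.status === "complete") return "ready";
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return "timed-out"; // caller renders the message and the Try Again button
}
```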

JakeChampion commented 8 years ago

I am not in favour of a blacklist approach. To handle internal websites, we could have our server do a ping check to see whether the website is visible to it; if the server can see the website, then I think it is safe to assume the third-party providers can see it as well.
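As a rough sketch of that server-side reachability check (the HEAD request, timeout value and error handling are all assumptions, not existing perf-widget behaviour):

```typescript
import * as https from "https";

// Resolve true if the server itself can fetch the page; internal-only sites
// will time out or fail, so we can respond immediately instead of leaving the
// widget stuck on "Analysing".
function isReachable(pageUrl: string, timeoutMs = 5000): Promise<boolean> {
  return new Promise((resolve) => {
    const req = https.request(pageUrl, { method: "HEAD", timeout: timeoutMs }, (res) => {
      res.resume(); // drain the response
      resolve(res.statusCode !== undefined && res.statusCode < 500);
    });
    req.on("timeout", () => { req.destroy(); resolve(false); });
    req.on("error", () => resolve(false));
    req.end();
  });
}
```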

With regards to Next.ft.com redirecting to ft.com when being tested by Google's PageSpeed Insights, I think this is an issue for Next.ft.com to fix. A possible solution would be to sniff the user-agent, check whether it is Google's PageSpeed Insights, and let it pass through.
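Purely as an illustration of the user-agent idea (the exact user-agent fragment PageSpeed Insights sends, and where this check would live in Next.ft.com, are assumptions):

```typescript
// Skip the redirect to ft.com when the request looks like it comes from
// Google's PageSpeed Insights fetcher.
function shouldRedirectToFtCom(userAgent: string | undefined): boolean {
  const ua = (userAgent || "").toLowerCase();
  const looksLikePageSpeed = ua.includes("page speed insights"); // assumed UA fragment
  return !looksLikePageSpeed;
}
```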

railsagainstignorance commented 8 years ago

Hm. Fence-sitting. I can see the benefit of having a blacklist: it provides a robust, quick means of avoiding a known problem site/page (perhaps there is a site which breaks PW in a UX- or system-affecting nightmare kind of way, and we intend to fix it, but not yet). But I prefer the ping check as the default approach (unless overridden by the blacklist).

AdaRoseCannon commented 8 years ago

The blacklist is nice because we can provide the reason a site is blacklisted. E.g. the reason Next is blacklisted is actually pretty subtle.

I think there is no reason not to have both: reasoned error messages for the sites we know don't work, and more general error messages when we detect that a page won't work.
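A sketch of the 'have both' flow, with the blacklist lookup and reachability check passed in as hypothetical helpers (names and messages are illustrative only):

```typescript
type WidgetDecision =
  | { run: false; message: string } // don't start; show a reasoned message
  | { run: true };                  // start the analysis as normal

async function decideWidgetBehaviour(
  pageUrl: string,
  lookupBlacklist: (url: string) => { reason: string } | undefined,
  isReachable: (url: string) => Promise<boolean>,
): Promise<WidgetDecision> {
  // Curated, reasoned messages for pages we already know won't work.
  const entry = lookupBlacklist(pageUrl);
  if (entry) return { run: false, message: entry.reason };

  // Generic detection fallback for everything else.
  if (!(await isReachable(pageUrl))) {
    return { run: false, message: "This page can't be reached by our analysis service." };
  }
  return { run: true };
}
```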