Closed by TheFrenchGhosty 4 years ago
Back in September, some developer who hosts an alternative Snopyta frontend on GitHub servers predicted that on the 20th of October his frontend would stop working because of a crippling issue that would affect Snopyta and other instances. I wonder if you expected this as well.
I was just about to create an issue about this.
Can confirm on my instance too. I use Huginn to make custom RSS feeds from Invidious. I noticed my news aggregator was looking empty today, investigated, and found this error.
> Back in September, some developer who hosts an alternative Snopyta frontend on GitHub servers predicted that on the 20th of October his frontend would stop working because of a crippling issue that would affect Snopyta and other instances. I wonder if you expected this as well.

Well, this is weird. Did he say what the problem would be?
I tried looking into this yesterday. The log doesn't show anything beyond a 500 status code, and even recompiling Invidious with the logger set to "DEBUG" level by default didn't show anything more. I also haven't seen a "debug" flag on the command line or anything.
Regardless of how this is resolved (I'm sure it's a minor change), I think the error handling (or lack thereof) leaves a lot to be desired, especially for a project you expect to break frequently due to YouTube changes. It should be improved so that future problems are easier to debug.
Is there anything preventing a proper stack trace from being displayed? In Python land that would be the default behavior unless you explicitly handle the exception (which you shouldn't do unless you know how to work around the error). Does the fact that Crystal is a compiled language prevent it from displaying a full stack trace, i.e. do you have to handle every exception, even if only to display a useless error page, or it won't compile?
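For what it's worth, as far as I know Crystal does print a full backtrace for unhandled exceptions, so being a compiled language isn't the reason the trace is missing; more likely a blanket rescue somewhere in request handling only returns a 500. Here is a minimal sketch of that (the helper name is made up, this is not Invidious's actual handler):

```crystal
# Hypothetical sketch, not the real Invidious error handler.
def channel_title(node : String?) : String
  # not_nil! raises NilAssertionError ("Nil assertion failed") when node is nil
  node.not_nil!
end

begin
  channel_title(nil)
rescue ex
  # A blanket rescue like this is what turns the failure into a bare 500.
  # If nothing logs the backtrace here, it is simply lost;
  # inspect_with_backtrace recovers it for the log.
  STDERR.puts ex.inspect_with_backtrace
end
```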
I haven't tested it on an actual instance, but this appears to come from get_about_info in channels.cr. Looking at the HTML of a random about page from YouTube, some nodes Invidious is looking for via XPath are missing, such as the span with class="qualified-channel-title-text". The check in line 800 does not appear to catch the case where the xpath_node(..) result is nil, so the not_nil! in line 806 raises the NilAssertionError. It looks like get_about_info needs an update for some YouTube changes. At least that's my theory, without any thorough testing.
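To illustrate the failing pattern and a nil-safe alternative, here is a rough sketch (identifiers and the XPath are taken from the discussion above, not copied from channels.cr):

```crystal
require "xml"

# Illustrative sketch only, not the exact code around channels.cr lines 800-806.
def channel_title(about_page_html : String) : String
  document = XML.parse_html(about_page_html)
  xpath = %q(//span[contains(@class, "qualified-channel-title-text")])

  # Failing pattern: not_nil! turns a missing node (after a YouTube layout
  # change) into a bare "Nil assertion failed" 500 response.
  #   document.xpath_node(xpath).not_nil!.content

  # Nil-safe variant that says what actually went missing.
  node = document.xpath_node(xpath)
  raise "qualified-channel-title-text not found; did the about page layout change?" unless node
  node.content
end
```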
I've manually reproduced the request that Invidious makes, and it seems like the "disable_polymer" parameter might not be working anymore? I'm getting an HTML skeleton page (intended to be "hydrated" by the front-end JS?), but all the data Invidious is after seems to be present in some JSON objects as well as meta tags, so the current XPaths would all fail.
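If the polymer page is what we're stuck with, something along these lines might work instead of XPath. This is only a sketch and assumes the page still assigns the data to a "ytInitialData" variable in an inline script, which YouTube could change again at any time:

```crystal
require "json"

# Sketch only: pull the embedded JSON out of a polymer channel page instead of
# querying the rendered HTML with XPath. Assumes an inline script assigns the
# data to "ytInitialData"; a real implementation should parse more carefully
# than this regex does.
def extract_initial_data(body : String) : JSON::Any
  if match = body.match(/ytInitialData"?\]?\s*=\s*(\{[\s\S]*?\})\s*;/)
    JSON.parse(match[1])
  else
    raise "ytInitialData not found in page body"
  end
end
```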
I wonder if this issue in youtube-dl is related to the same problem.
Thank you, updated, and it works now.
YouTube changed something (again), and channels now trigger "Nil assertion failed".
Edit: this also breaks RSS.
Link used: https://invidious.snopyta.org/channel/UCXuqSBlHAE6Xw-yeJA0Tunw
Like #1422, this affects every instance, public or private, with many users or few, with anti-captcha set up or not.
Related #1422
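For anyone who wants to reproduce quickly, something like this shows the failure on both the channel page and the RSS feed (assuming the usual Invidious routes, /channel/<ucid> and /feed/channel/<ucid>):

```crystal
require "http/client"

# Quick reproduction sketch against the instance linked above.
ucid = "UCXuqSBlHAE6Xw-yeJA0Tunw"
["/channel/#{ucid}", "/feed/channel/#{ucid}"].each do |path|
  response = HTTP::Client.get("https://invidious.snopyta.org#{path}")
  # Both come back as 500 while this bug is present.
  puts "#{path} -> #{response.status_code}"
end
```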