feediron / ttrss_plugin-feediron

Evolution of ttrss_plugin-af_feedmod
https://discourse.tt-rss.org/t/plugin-update-feediron-v1-2-0/2018
MIT License

TTRSS doesn't fetch ironed feed #187

Closed: verlen64 closed this issue 1 year ago

verlen64 commented 1 year ago

I am running the latest version of ttrss-docker and the feediron plugin. Both are working, but in some strange way they don't communicate with each other.

Do you have any idea what causes these issues and how to solve them?

The ttrss error log shows messages like:

E_WARNING (2) | plugins.local/feediron/init.php:705 | Trying to access array offset on value of type null
1. plugins.local/feediron/init.php(705): ttrss_error_handler(Trying to access array offset on value of type null, plugins.local/feediron/init.php)
2. classes/pluginhandler.php(15): add()
3. backend.php(144): catchall(add)

Real IP: 5.28.xxx.xxx
Forwarded For: 5.28.xxx.xxx
Forwarded Protocol: https
Remote IP: 192.168.xxx.xx
Request URI: /backend.php
User agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101 Firefox/102.0

My feediron configuration looks like this:

{
    "name": "heise.de",
    "url": "heise.de|telepolis.de",
    "stamp": 1599066467,
    "author": "uqs",
    "match": "heise.de|telepolis.de",
    "config": {
        "type": "xpath",
        "xpath": "article",
        "reformat": [
            {
                "type": "regex",
                "pattern": "\/\\.html\/",
                "replace": ".html?seite=all"
            }
        ],
        "modify": [
            {
                "type": "regex",
                "pattern": "\/<.?noscript>\/",
                "replace": ""
            },
            {
                "type": "regex",
                "pattern": "\/<.?a-(lightbox|bilderstrecke)[^>]*?>\/",
                "replace": ""
            }
        ],
        "cleanup": [
            "h1[contains(@class,'a-article-header__title')]",
            "div[contains(@class,'a-article-header__service')]",
            "figure[@class='branding']",
            "a-collapse[contains(@class,'a-box--collapsable')]",
            "a-teaser",
            "a-ad",
            "aside",
            "a-script",
            "a-img",
            "footer"
        ]
    }
}
dugite-code commented 1 year ago

Do you have any ad-blocker like Pi-hole? I suspect you might be having issues resolving DNS queries; the error regarding the GitHub API suggests to me that TT-RSS is simply unable to reach GitHub's servers, since it reports no more specific error.
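
A quick way to check is to query the GitHub API from inside the TT-RSS container. This is a hypothetical diagnostic snippet, not part of feediron, and the User-Agent value is made up:

<?php
// Hypothetical reachability check for the GitHub API from inside the
// TT-RSS container. api.github.com rejects requests without a User-Agent
// header, so one is set explicitly.
$ctx = stream_context_create([
    'http' => ['header' => "User-Agent: feediron-reachability-check\r\n"],
]);
$response = @file_get_contents('https://api.github.com/rate_limit', false, $ctx);

if ($response === false) {
    echo "api.github.com not reachable - check DNS resolution / ad-block filtering\n";
} else {
    echo "GitHub API reachable:\n" . $response . "\n";
}

If this prints the rate-limit JSON, name resolution and outbound HTTPS are fine and the problem lies elsewhere.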

The reason your filters are not working is that the recipe files are not a 1:1 copy-paste into the main config file. Your example should look like this:

{
    "heise.de|telepolis.de": {
        "type": "xpath",
        "xpath": "article",
        "reformat": [
            {
                "type": "regex",
                "pattern": "\/\\.html\/",
                "replace": ".html?seite=all"
            }
        ],
        "modify": [
            {
                "type": "regex",
                "pattern": "\/<.?noscript>\/",
                "replace": ""
            },
            {
                "type": "regex",
                "pattern": "\/<.?a-(lightbox|bilderstrecke)[^>]*?>\/",
                "replace": ""
            }
        ],
        "cleanup": [
            "h1[contains(@class,'a-article-header__title')]",
            "div[contains(@class,'a-article-header__service')]",
            "figure[@class='branding']",
            "a-collapse[contains(@class,'a-box--collapsable')]",
            "a-teaser",
            "a-ad",
            "aside",
            "a-script",
            "a-img",
            "footer"
        ]
    }
}
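
This may also explain the init.php warning above. Here is a minimal sketch of a per-pattern lookup (hypothetical PHP, not the actual feediron code; find_rule() and recipe.json are made up for illustration): with the full recipe pasted verbatim, the top-level keys are metadata rather than match patterns, so nothing matches the article URL and the later array access happens on null.

<?php
// Minimal sketch, not the actual feediron code: find_rule() and recipe.json
// are hypothetical and only illustrate the failure mode.

// Return the first rule whose key (treated as a URL match pattern) matches the link.
function find_rule(array $config, string $link)
{
    foreach ($config as $pattern => $rule) {
        if (preg_match('/' . $pattern . '/', $link)) {
            return $rule;
        }
    }
    return null;
}

$link = 'https://www.heise.de/news/example.html';

// Full recipe file pasted verbatim: the top-level keys are "name", "url",
// "stamp", "author", "match" and "config" - none of them match the link.
$wrapped = json_decode(file_get_contents('recipe.json'), true);
$rule = find_rule($wrapped, $link);   // null
echo $rule['type'];                   // E_WARNING: Trying to access array offset on value of type null

// Config keyed by the match pattern itself, as in the corrected example above.
$config = ['heise.de|telepolis.de' => $wrapped['config']];
$rule = find_rule($config, $link);    // the xpath rule is found
echo $rule['type'];                   // "xpath"
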
verlen64 commented 1 year ago

Thanks a lot for your fast reply, @dugite-code! This solved my second issue in 10 seconds. The reason was in fact that I had omitted those first lines when pasting the recipe into the testing tab, as prescribed in your README, but I didn't omit them in the configuration tab because I assumed they were just comments. Strangely, even the first issue didn't occur again; maybe I hit some pull limit of GitHub the other day. Thus, I suggest closing this issue completely.