I-A-C / script.module.lambdascrapers

Scrapers module for Exodus-based add-ons

Scraper Check 11/00/2018 #35

Closed · jewbmx closed this issue 5 years ago

jewbmx commented 5 years ago

Scraper Check 11-08-2018 (Double-check my findings!) (Tested by hand in a browser and 1 by 1 in my Scrubs addon.) (Droid S7 Edge, Iowa, USA)

_enDbird

_DbirdBroke

_Dbird2Fix

_DbirdFixed

_DbirdWorked

jewbmx commented 5 years ago

The next post is the normal en list. Only checking the sources_lambdascrapers folder, and en only.

The above post is the en_DebridOnly folder.

jewbmx commented 5 years ago

Scraper Check 11-08-2018 (Double-check my findings!) (Tested by hand in a browser and 1 by 1 in my Scrubs addon.) (Droid S7 Edge, Iowa, USA)

_en

_NoJoy (All of these failed to find anything for me.) (Most are probably broken, will look into them later.) (If ya don't wanna wait, check my script.module.Scrubs for a fix.)

Will go deeper into the NoJoy pile tomorrow; my eyes hurt lol, been in scraper hell for a few days straight.

host505 commented 5 years ago

Well done mate. Mind doing a PR to include your fixed scrapers in LS? Or do you want me to do it for you? The Yoda team fixed a few in their last update (wrzcraft is one that comes to mind right now) but I don't feel comfortable just copying them to LS. They did use our rlsbb fix though - they actually used both fixes, the domain prefix change and the cfscrape import - no idea why.

jewbmx commented 5 years ago

NP, I was already checking mine after going through my folder of broken ones lol. You can do the PR, I don't really know my way around git that much lmao.

Didn't know Yoda still got updates, guess I need to check my repos.

jewbmx commented 5 years ago

Wow, I gotta click that update check button more often lol, I was missing out on 13 updates and 1 bullshit one that I still won't click (suprm. LiveResolv).

host505 commented 5 years ago

Whoa, relax dude, sure there are just a couple of ways to fix it. I only saw the update you did yesterday, how the hell was I supposed to know you did it earlier. Relax, nobody tried to take credit from you for changing the site prefix, but I sure didn't see your fix before applying mine. And yes, we're lucky to have some people fixing stuff, what's your point?

jewbmx commented 5 years ago

Quick 10-min check of the new Yoda update. Here are all the ones that don't find results for me. Maybe they will see this lol.

- bob.py is still broken for me.
- Still sporting alluc too.
- Missing the settings for the pron account info, which makes pron.py useless if it still works.

host505 commented 5 years ago

You're all messed up. I didn't use the cfscrape fix; I used the .to domain that has not yet implemented Cloudflare. jewbmx's solution was to use cfscrape on the .ru domain. Both ways work. Yoda implemented both fixes, the .to prefix and cfscrape, and I wondered why, because .to doesn't use Cloudflare. As for your insults, I won't be bothered, kiddo, you obviously have issues. And yes, I'm a newb at coding (never claimed otherwise), but I've been around long enough not to care what dipshits like you say.
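
For anyone following along, here's a rough sketch of the two approaches, assuming a requests-style scraper; the domains match what was discussed above, but the code itself is illustrative, not the actual LS scraper:

```python
# Rough illustration of the two rlsbb fixes discussed above; not the actual
# scraper code, and the exact domains may have changed since.
import requests
import cfscrape  # pip install cfscrape; a requests wrapper that solves the Cloudflare JS challenge

# Fix A: point the scraper at the .to mirror, which (at the time) was not
# behind Cloudflare, so plain requests work.
html_to = requests.get('http://rlsbb.to', timeout=15).text

# Fix B: keep the Cloudflare-protected .ru domain, but fetch it through a
# cfscrape session so the anti-DDoS challenge gets solved first.
session = cfscrape.create_scraper()        # behaves like a requests.Session
html_ru = session.get('http://rlsbb.ru', timeout=15).text
```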

host505 commented 5 years ago

Like I said, you obviously have issues. When did I imply I was special because I changed a fuckin domain prefix? Always the drama bitch you've always been.

host505 commented 5 years ago

@jewbmx he actually doesn't want to hide (much), he's too big of an attention whore to do so ;)

host505 commented 5 years ago

First of all, LS & Exodus Redux didn't start as a 'team' project, i.e. the name wasn't decided by a team. Heck, I personally don't consider myself part of any team; if I have something to offer, I will. Personally I don't care much about Exodus Redux, I use LS in my own add-on. And I don't even like the name either - I don't think λ would either; he would probably want his name gone, given the size that what he created has grown to.

But what you're always doing is belittling every newb attempt. I didn't see any of the major devs that created all this come at you when you created your monkey business.

host505 commented 5 years ago

@jewbmx your fixed debrid providers work fine, thanks. Bestmoviez brings few but good-quality links. Moviesonline, on the other hand, brings a bunch of (mostly) dead oload links, and it always marks them as SD. Not sure if the site itself only has SD links. I've actually found other scrapers too that don't implement any sort of quality identification and just hardcode quality: 'SD' or 'HD'.
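
On the quality point, something as small as the sketch below would already be better than hardcoding; the helper name and the label markers here are just illustrative, not code from any LS scraper:

```python
# Illustrative helper: derive a quality tag from a release/link label instead of
# hardcoding quality: 'SD'. The recognised markers are a guess, not exhaustive.
def guess_quality(label):
    label = label.lower()
    if '2160' in label or '4k' in label:
        return '4K'
    if '1080' in label:
        return '1080p'
    if '720' in label:
        return '720p'
    if any(tag in label for tag in ('cam', 'hdts', 'telesync', ' ts ')):
        return 'CAM'
    return 'SD'  # only fall back to SD when nothing better is detected

# e.g. sources.append({'source': host, 'quality': guess_quality(name), 'url': url, ...})
```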

jewbmx commented 5 years ago

Yeah, I sort my scrapers a little differently in my addon and placed moviesonline in my messy section lol, it probably needs some TLC. Bestmoviez I consider to be a good one. I don't really like scrapers that pull more than 5 or 6 results.

bjgood commented 5 years ago

shellc0de69, why don't you just go away, dude. At least host505 and jewbmx are contributing. They're coming up with some pretty good code... that works. As one of them said... you're just a troll. You're obviously not a nice person.

Like I said, why don't you just go away. We like "positive" attitudes around here.

And giving yourself a thumbs up... ha! Nice touch. :D

SerpentDrago commented 5 years ago

@I-A-C, can you please delete/block/remove shellc0de69's messages? He is causing a mess in a great thread about trying to fix these scrapers. (This is the same dude that caused issues on a4k. Pretty sure it's blamo lol.)

For anyone wanting to block someone, go here: https://github.com/settings/blocked_users

Anyways, great work guys. @jewbmx awesome job looking into all those scrapers, that's a lot of time!

For anyone else that reads this, let's just not respond to the coder troll :) I'm done, I've said my piece!

SerpentDrago commented 5 years ago

@jewbmx are all these changes in your addon? Can I get a link?

@host505 I think you understand what's going on here more than I do, wanna do the PR? I guess we should put the non-working ones in a folder and not just delete them?

jewbmx commented 5 years ago

And here I thought most of my posts go unnoticed lol. As for cleaning up this mess of a post, don't bother; I'm gonna be working on the NoJoy ones later tonight, and then that will be the last post before this is closed. Oh, and to the troll: I'm totally ripping about 4 scrapers from Yoda so I don't have to fix them <3 fuckin freak

SerpentDrago commented 5 years ago

We need a built-in scraper checker that auto-disables scrapers that aren't working!
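
Something along these lines could be a first pass; it assumes the Exodus-style layout where every provider .py exposes a source class, and the probe part is left open since signatures differ between forks:

```python
# Sketch of an automatic scraper checker (not an existing feature). It walks a
# providers package, instantiates each provider, and records whether a caller-
# supplied probe (e.g. "search a known-good title") returned any sources.
import importlib
import pkgutil

def check_providers(package, probe):
    results = {}
    for _, name, _ in pkgutil.iter_modules(package.__path__):
        try:
            module = importlib.import_module(package.__name__ + '.' + name)
            results[name] = bool(probe(module.source()))   # assumes an Exodus-style `source` class
        except Exception:
            results[name] = False                          # import/runtime errors count as broken
    return results

# Providers mapped to False could then be auto-disabled (or just reported) in settings.
```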

jewbmx commented 5 years ago

I most likely already have a few that are fixed in my addon, but not all of them lol. My phone's folder has a bit more done, just need to finish my fix folder too lmao

jewbmx commented 5 years ago

You can get my stuff by clicking my profile on here and then the regular 'repo'.

jewbmx commented 5 years ago

Use the icefilms-style code that has the hostchecker at the top lol, fails to localhost if I recall.

I just do a full check every month if not weekly.

SerpentDrago commented 5 years ago

> Use the icefilms-style code that has the hostchecker at the top lol, fails to localhost if I recall.

> I just do a full check every month if not weekly.

What do you mean by this?

Also, the zip in your repo says it's 3 days old. Wasn't sure if that was updated. It's just zips.

host505 commented 5 years ago

> @host505 I think you understand what's going on here more than I do, wanna do the PR? I guess we should put the non-working ones in a folder and not just delete them?

Removing them from this list should be sufficient. Tbh I find the current architecture/settings set a little confusing. There is a 'default for current scraper' setting that only enables the aforementioned set, and then there's 'enable all', which enables everything in the current sources folder? There are many duplicates/broken ones in there. They're not enabled by default, but the user shouldn't have the ability to enable them all, even manually. So yes, separating the broken/duplicates and ditching some settings is needed imo.

supremacy-addon commented 5 years ago

how about u take yoda scrapers out

I-A-C commented 5 years ago

Gone!!

supremacy-addon commented 5 years ago

cool

I-A-C commented 5 years ago

@host505, I just pushed a commit aligning the defaults to jewbmx's list. Now that I've dropped Yoda, I'm going through the settings.xml to remove their now obsolete scrapers from lambdascrapers.

I acknowledge the current architecture/settings are a bit confusing... too many options I guess. Just because you CAN 'enable all providers' doesn't necessarily mean that you SHOULD enable all. I'm going to remove the enable/disable providers lines from the 'Modules' section. I do find the enable/disable all helpful in the 'foreign' and 'debrid' sections and, to a lesser degree, the 'providers' section also.

host505 commented 5 years ago

> Just because you CAN 'enable all providers' doesn't necessarily mean that you SHOULD enable all

Yes, but users will do so. To be clear, because it isn't to me: is this setting ('enable all providers for current module') supposed to enable all providers within the selected module's folder? And what's the difference (if any) from the Providers tab's 'Enable all providers'? If these settings enable all providers within a source's folder, imo we should either clean the lambdascrapers/en folder (leave only what is in the 'default' list) or ditch these settings for this module. We shouldn't allow the user to enable the duplicates in any case.

I-A-C commented 5 years ago

> To be clear, because it isn't to me: is this setting ('enable all providers for current module') supposed to enable all providers within the selected module's folder?

Yes

> And what's the difference (if any) from the Providers tab's 'Enable all providers'?

This tab enables all providers independent of module, meaning all of the Placenta, Incursion (previously Yoda) scrapers are enabled.

I have a commit ready that would clean out the non-working scrapers from the lambdascrapers/en folder. I just want to get the settings.xml file cleaned up before I push that.
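
For reference, 'enable all providers for current module' amounts to roughly the sketch below; only the folder walk and the xbmcaddon call are standard Kodi, and the 'provider.<name>' setting id scheme is made up for illustration:

```python
# Hedged sketch of an 'enable all providers in this module folder' pass.
# The setting id scheme is hypothetical, not the actual settings.xml layout.
import os
import xbmcaddon  # Kodi add-on settings API

def enable_all_in_folder(folder, addon_id='script.module.lambdascrapers'):
    addon = xbmcaddon.Addon(addon_id)
    for filename in sorted(os.listdir(folder)):
        if filename.endswith('.py') and not filename.startswith('__'):
            provider = filename[:-3]
            addon.setSetting('provider.' + provider, 'true')
```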

host505 commented 5 years ago

OK, and where do you intend to put the broken/duplicate scrapers? Within LS, in a non-scrapable folder? Or maybe create a separate repository for them? They should somehow remain available for people to fix.

jewbmx commented 5 years ago

I'd just remove all the old addon folders; I find them useful but don't wanna be checking all of those while also maintaining the main pile. Also, for any broken ones you can just move them to a folder labeled anything ya want, like _Down, and then we can easily find them to fix later. Settings can stay since the item isn't in the right spot to be activated. Same for dupes with _Spares, but leave them out of the settings and worry about them when they're needed.

SerpentDrago commented 5 years ago

I'd honestly REALLY REALLY like Redux to enable and use LambdaScrapers by default. Users tend to have sooo many issues lately, and something was weird where you would have to enable all in the LambdaScrapers settings to get it working. IDK.

SerpentDrago commented 5 years ago

And I don't feel you should even have a module in LambdaScrapers, a scraper set or whatever; it should just be LambdaScrapers, period.

jewbmx commented 5 years ago

Here's a small portion of the NoJoys.

jewbmx commented 5 years ago

Can't help myself lol, does anyone else see how stupid it is to copy & paste a lame lil ASCII picture making fun of copy & pasters? Pot calling the kettle black 🖕

SerpentDrago commented 5 years ago

@jewbmx where do I get your latest fixes/changes? I like to be a bit more updated and bleeding edge than waiting on changes to the official repo for lambda (this one).

https://github.com/jewbmx/repo/tree/master/zips/script.module.Scrubs says it was last updated 4 days ago?

jewbmx commented 5 years ago

Scroll to the start of this post lol, my bad, I forgot to say where I put them.

jewbmx commented 5 years ago

iwatchflix - now broken lol, was too slow on the warning.

darewatch - reported down by email; I don't use it, but it looks like a DDoS protection issue with CF.

Looks like 2ddl is running smoothly again without any issues. Tested every hour for 24, but didn't ever click play for a final check tho.

jewbmx commented 5 years ago

Pretty sure about 5 or more other scrapers needed fixing today, but I didn't keep track of them; also don't remember if ya even use them in this project.

jewbmx commented 5 years ago

I'm starting to think we should leave this open and redo it every week or month so people can use it to point out scrapers that might need looking at too. Like right now, if you were clicking away and saw that a usual scraper result isn't there, it probably needs checking lol; or if you see the same 4 scrapers are always the last to finish, they probably need checking too, even though they could just be slow ones.

doko-desuka commented 5 years ago

> how about u take yoda scrapers out

@supremacy-addon I don't think that's fair of you; you have the primewire scraper from LS, you only removed the comments (you forgot one btw, "# In milliseconds."), and also the darewatch one. What are you protecting? What's the end goal? I don't think anybody really cares who the author is, they just care that it works.
I wouldn't mind one bit if all these addons shared each other's scrapers. We're stronger together and that sort of thing...

@jewbmx nice catch with darewatch, they added a Cloudflare anti-DDoS step.
UniversalScrapers has a nice CF challenge solver, but I think Exodus also has one, in modules/cfdecoder.py.
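
One common pattern is to only fall back to a solver when the plain request gets bounced; the 503/'cloudflare' header check below is just a heuristic, not how cfdecoder.py or UniversalScrapers actually do it:

```python
# Rough sketch: retry through a Cloudflare challenge solver only when the plain
# request looks like it hit the anti-DDoS interstitial (heuristic check).
import requests
import cfscrape

def fetch(url):
    r = requests.get(url, timeout=15)
    if r.status_code == 503 and 'cloudflare' in r.headers.get('Server', '').lower():
        r = cfscrape.create_scraper().get(url, timeout=15)  # solves the JS challenge, then refetches
    return r.text
```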

SerpentDrago commented 5 years ago

To be fair to supremacy (and Doko, I agree with you by the way), I think having Yoda as a MODULE to select was the major issue, not the fact that scrapers are mixed/whatever. It confuses users; who knows, maybe they go complain to sup about Yoda issues when they are just using Yoda modules in this addon.

I give them the benefit of the doubt, probably. But whatever.

IMHO LambdaScrapers should be one module, all mashed together and regularly updated (basically what's starting to happen).

doko-desuka commented 5 years ago

That makes sense Serpent, I hope you're right.

jewbmx commented 5 years ago

I think the request was to remove the Yoda collection; don't think the singles in the main folder mattered. And if they got removed, just get new ones from eggman's Poached, Numbers, or even my addon lol. I personally don't think anyone should complain about a copied scraper, especially now that everyone keeps copying that boxset code that I think Numbers made by hand lol

jewbmx commented 5 years ago

Also, I didn't fix darewatch because I'm not a fan of it lol, always had it disabled for me; think there's only 2 spots to adjust for cfscrape tho.

host505 commented 5 years ago

Hey @jewbmx, do you have any other goods in your basket? I've been testing some of your fixed scrapers; I've cherry-picked some and want to make a PR with them on LS (if you don't mind). If you have some more, would you please upload them to the server from the first post? I've also checked your add-on and picked some from there as well.

Side question: do you know where the 1080P scraper originated from? Also, the version in your add-on, was it fixed by you or copied from Yoda? Thanks mate, keep 'em coming!

jewbmx commented 5 years ago

Feel free to share them. As to 1080p, it's commented at the top lol, fixed by me from Muad'Dib's YouTube video. The first copy came from Placenta before the rewrite.

jewbmx commented 5 years ago

Make sure you compare them to the scrapers named like them. People seem to be tossing 'my' scrapers into their folders right next to 3 matching scrapers lol

host505 commented 5 years ago

Already have. I'll rename yours to match the names of the broken ones from LS, to avoid creating extra settings.

jewbmx commented 5 years ago

Anyone having issues getting scrapers to work? I've been getting emails saying that every scraper that uses the ip_player code in its resolve section is broken. If I recall, there are around 8 or more that use that code. Also, the error won't be noticed unless someone tries to play a source from one of these scrapers.