steevenakintilo / TwitterGiveawayBot

A bot that enters you into any Twitter giveaway
Apache License 2.0
18 stars 3 forks

Error Chromedriver #2

Closed Rana-0003 closed 1 year ago

Rana-0003 commented 1 year ago

Last time I was stuck with snscrape; I solved that issue by upgrading Python from 3.6 to 3.9. Now I'm stuck again with this traceback.

When you asked to unzip chromedriver to the path mentioned in the code, I just put it into the TwitterGiveawayBot folder. Did I make a mistake there? Otherwise everything is fine.

Traceback (most recent call last):
  File "d:\TwitterGiveawayBot-main\main.py", line 1, in <module>
    from twiiiiter import *
  File "d:\TwitterGiveawayBot-main\twiiiiter.py", line 21, in <module>
    class Scraper:
  File "d:\TwitterGiveawayBot-main\twiiiiter.py", line 26, in Scraper
    driver = webdriver.Chrome(executable_path="chromedriver", options=options)  # to open the chromedriver
TypeError: __init__() got an unexpected keyword argument 'executable_path'

steevenakintilo commented 1 year ago

I know how to fix it: just remove the chromedriver part, see the screenshot Screenshot_20230622-155738_Chrome.jpg
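For context, Selenium 4 removed the `executable_path` keyword from `webdriver.Chrome()`, which is what triggers the TypeError above. A minimal sketch of the corrected driver setup is below; it assumes Selenium 4.6+ (which bundles Selenium Manager, so no manual chromedriver path is needed), and the `make_driver` helper name is illustrative, not from the bot's source.

```python
def make_driver(headless=False):
    """Build a Chrome driver without the removed executable_path argument."""
    # Imported lazily so this sketch can be loaded without Selenium installed.
    from selenium import webdriver

    options = webdriver.ChromeOptions()
    if headless:
        options.add_argument("--headless=new")
    # Selenium 4 removed executable_path: pass options only and let
    # Selenium Manager locate a matching chromedriver automatically.
    return webdriver.Chrome(options=options)

# If you must pin a specific chromedriver binary, Selenium 4 uses a
# Service object instead of executable_path:
#   from selenium.webdriver.chrome.service import Service
#   driver = webdriver.Chrome(service=Service("path/to/chromedriver"),
#                             options=options)

if __name__ == "__main__":
    driver = make_driver()
    driver.get("https://twitter.com")
    driver.quit()
```

So rather than deleting the chromedriver binary, the key change is dropping the `executable_path="chromedriver"` argument from the `webdriver.Chrome(...)` call in `twiiiiter.py`.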

Rana-0003 commented 1 year ago

Thanks, it's working fine now. But I'm targeting specific giveaways, so it's giving me a lot of empty attempts. Otherwise I love the work you've done here :)

Here are the final results!

DevTools listening on ws://........................
Hello world
Inside main one
Starting the program
Searching for Giveaway
Stopping after 20 empty pages
(the "Stopping after 20 empty pages" line repeats 17 times)

steevenakintilo commented 1 year ago

Good, happy for you mate. The "Stopping after 20 empty pages" "error" is normal; I checked on the snscrape GitHub and it's just a precautionary message.

I'm happy that you use my bot and find it good :)