huaying / instagram-crawler

Get Instagram posts/profile/hashtag data without using Instagram API
MIT License

selenium.common.exceptions.ElementClickInterceptedException #90

Open gyunggyung opened 4 years ago

gyunggyung commented 4 years ago

I ran python crawler.py hashtag -t 연습 -o ./output -n 15 and got the following traceback:

DevTools listening on ws://127.0.0.1:8393/devtools/browser/a7d1144b-bb0e-45c4-8249-0aea43743ed7
Traceback (most recent call last):
  File "crawler.py", line 96, in <module>
    get_posts_by_hashtag(args.tag, args.number or 100, args.debug), args.output
  File "crawler.py", line 42, in get_posts_by_hashtag
    ins_crawler = InsCrawler(has_screen=debug)
  File "C:\Users\hwnau\Desktop\test\instagram-crawler-master\inscrawler\crawler.py", line 70, in __init__
    self.login()
  File "C:\Users\hwnau\Desktop\test\instagram-crawler-master\inscrawler\crawler.py", line 87, in login
    login_btn.click()
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\webelement.py", line 80, in click
    self._execute(Command.CLICK_ELEMENT)
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\webelement.py", line 628, in _execute
    return self._parent.execute(command, params)
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 312, in execute
    self.error_handler.check_response(response)
  File "C:\Users\hwnau\.conda\envs\cuda\lib\site-packages\selenium\webdriver\remote\errorhandler.py", line 242, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.ElementClickInterceptedException: Message: element click intercepted: Element <button class="sqdOP  L3NKy   y3zKF     " disabled="" type="submit">...</button> is not clickable at point (391, 243). Other element would receive the click: <div class="                    Igw0E     IwRSH      eGOV_         _4EzTm    bkEs3                          CovQj                  jKUp7          DhRcB                                                    ">...</div>
  (Session info: headless chrome=81.0.4044.138)
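Editor's note: ElementClickInterceptedException means another element was covering the login button when Selenium issued the click; the button is also disabled, which fits the root cause identified further down (empty login form, i.e. missing credentials). For cases where the click really is blocked by an overlay, a generic Selenium workaround, not something from this repo, is to wait until the element is clickable and fall back to a JavaScript click. The selector below is a placeholder, not the crawler's actual one:

```python
# Generic Selenium workaround for ElementClickInterceptedException.
# The CSS selector is a placeholder; adjust it to the real login button.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
driver.get("https://www.instagram.com/accounts/login/")

# Wait until the button is reported as clickable instead of clicking immediately.
login_btn = WebDriverWait(driver, 10).until(
    EC.element_to_be_clickable((By.CSS_SELECTOR, "button[type='submit']"))
)
try:
    login_btn.click()
except Exception:
    # Fall back to a JavaScript click, which is not blocked by overlapping elements.
    driver.execute_script("arguments[0].click();", login_btn)
```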
gabrofig commented 4 years ago

same here

Feywell commented 4 years ago

same issue

hyemmie commented 4 years ago

same error

uniglot commented 4 years ago

I ran into the same problem, but after reading through the code I figured out it happens because I hadn't provided my login credentials. There are several ways to supply your account info: I filled mine in in inscrawler/secret.py, but you can also export them as environment variables.
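Editor's note: a minimal sketch of what the secret.py route might look like, assuming the file exposes plain username/password variables. The variable names here are assumptions; check the template file shipped with the repo and the login code for the exact names it (and the environment-variable path) actually reads:

```python
# inscrawler/secret.py -- sketch only; the variable names are assumptions,
# verify them against the template file in your copy of the repo.
username = "your_instagram_username"
password = "your_instagram_password"
```

The environment-variable route is equivalent: export the same credentials in your shell before running crawler.py, using whichever variable names the login code looks up.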