MatthewChatham / glassdoor-review-scraper

Scrape reviews from Glassdoor
BSD 2-Clause "Simplified" License

No Such Element Exception #46

Open bartels50642 opened 4 years ago

bartels50642 commented 4 years ago

Hi All,

I'm trying to run this code as of Oct 2020, but I keep running into a "No Such Element Exception" error. Any help here would be GREATLY appreciated! Here is the output and error I get:

2020-10-26 16:52:48,040 INFO 3 :(21411) - Scraping up to 25 reviews.
2020-10-26 16:52:48,096 INFO 2 :(21411) - Signing in to bartels.k@northeastern.edu

NoSuchElementException                    Traceback (most recent call last)
in <module>
     44
     45 if __name__ == '__main__':
---> 46     main()

in main()
      7
      8
----> 9     sign_in()
     10
     11     if not args.start_from_url:

in sign_in()
      7     # import pdb;pdb.set_trace()
      8
----> 9     email_field = browser.find_element_by_name('username')
     10     password_field = browser.find_element_by_name('password')
     11     submit_btn = browser.find_element_by_xpath('//button[@type="submit"]')

/opt/anaconda3/lib/python3.7/site-packages/selenium/webdriver/remote/webdriver.py in find_element_by_name(self, name)
    494             element = driver.find_element_by_name('foo')
    495         """
--> 496         return self.find_element(by=By.NAME, value=name)
    497
    498     def find_elements_by_name(self, name):

/opt/anaconda3/lib/python3.7/site-packages/selenium/webdriver/remote/webdriver.py in find_element(self, by, value)
    976         return self.execute(Command.FIND_ELEMENT, {
    977             'using': by,
--> 978             'value': value})['value']
    979
    980     def find_elements(self, by=By.ID, value=None):

/opt/anaconda3/lib/python3.7/site-packages/selenium/webdriver/remote/errorhandler.py in check_response(self, response)
    240                 alert_text = value['alert'].get('text')
    241             raise exception_class(message, screen, stacktrace, alert_text)
--> 242         raise exception_class(message, screen, stacktrace)
    243
    244     def _value_or_default(self, obj, key, default):

NoSuchElementException: Message: no such element: Unable to locate element: {"method":"css selector","selector":"[name="username"]"}
(Session info: chrome=86.0.4240.111)
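For reference, the traceback shows sign_in() failing at browser.find_element_by_name('username'), which means the lookup runs before the login form is present, or Glassdoor has renamed the field since the script was written. Below is a minimal sketch of a more defensive lookup, assuming the script's global `browser` WebDriver object; the fallback locators are guesses and should be checked against the current login page before relying on them.

```python
# Sketch only: the fallback locators below are assumptions, not verified selectors.
from selenium.common.exceptions import TimeoutException
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def find_email_field(browser, timeout=10):
    """Return the login email field, trying several candidate locators in turn."""
    candidates = [
        (By.NAME, "username"),                     # locator used by the original script
        (By.CSS_SELECTOR, "input[type='email']"),  # generic fallback (assumption)
    ]
    for locator in candidates:
        try:
            # each candidate gets up to `timeout` seconds to appear
            return WebDriverWait(browser, timeout).until(
                EC.presence_of_element_located(locator))
        except TimeoutException:
            continue
    raise TimeoutException("No candidate locator matched the login email field")
```

Giving each candidate its own short explicit wait keeps the failure mode visible (a timeout on every known locator) instead of an immediate NoSuchElementException the moment the page is slow to render.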
sachinchaturvedi93 commented 4 years ago

https://github.com/sachinchaturvedi93/glassdoor-review-scraper

Try this out, though I haven't updated it in a while. Let me know if I can help.

bartels50642 commented 4 years ago

Hi Sachin,

Thanks for the speedy reply! I tried your code and it looks like part of it is working (Chrome opens, goes to Glassdoor, gets through login, and opens the "Premise Data Corporation" landing page), but I still get an error in Python like this:

NoSuchElementException: Message: no such element: Unable to locate element: {"method":"css selector","selector":".paginationPaginationStylepage.paginationPaginationStylecurrent"} (Session info: chrome=86.0.4240.111)

Any ideas what could be wrong?
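For context, the selector in that message is a generated class name that Glassdoor tends to change between redesigns, so the first step is to see what the pagination markup currently looks like. Here is a small debugging sketch, assuming the scraper's global `browser` object and Selenium 3-style calls; the "pagination" substring match is an assumption, not a verified selector.

```python
# Sketch for debugging: dump the pagination HTML so the real class names can be read.
# The "[class*='pagination']" filter is a guess; if it matches nothing, print
# browser.page_source instead and search it by hand.
elements = browser.find_elements_by_css_selector("[class*='pagination']")
for el in elements:
    print(el.get_attribute("outerHTML"))
```

Once the real class names are visible, the current-page selector in the scraper can be updated to match.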

sachinchaturvedi93 commented 4 years ago

I guess the element isn't loading. Try again, or try adding wait time. I'm sure that's the class name of the current-page element; I checked the page source too.

bartels50642 commented 4 years ago

Hmm, okay, how would I go about adding wait time? Sorry, I'm new to scraping...

bartels50642 commented 4 years ago

Should I change "time.sleep(1)" to "time.sleep(500)" (or some other, larger number)?
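Bumping the sleep to a very large number mostly just blocks the script. A lighter alternative is an explicit wait, which polls until the element appears or a timeout expires. A sketch follows, reusing the selector from the error above and the scraper's global `browser`; both are assumptions about the current code and page.

```python
# Sketch: explicit wait instead of a longer time.sleep(). The selector is the one
# reported in the error above and may need updating for the current Glassdoor markup.
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

CURRENT_PAGE_SELECTOR = ".paginationPaginationStylepage.paginationPaginationStylecurrent"

# waits up to 30 seconds, but returns as soon as the element is present
current_page = WebDriverWait(browser, 30).until(
    EC.presence_of_element_located((By.CSS_SELECTOR, CURRENT_PAGE_SELECTOR)))
```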

Cangelelli commented 3 years ago

@bartels50642 Did you ever figure it out? I am having the same issue with the page-turner.