I am using Python and Selenium on AWS Lambda for crawling. After updating Python to 3.11 and Selenium to 4.18.0, my crawlers stopped working. This is the Selenium code:
import os

from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.chrome.service import Service


def get_headless_driver():
    options = Options()
    service = Service(executable_path=r'/opt/chromedriver')
    options.binary_location = '/opt/headless-chromium'
    options.add_argument('--headless')
    options.add_argument('--no-sandbox')
    options.add_argument('--single-process')
    options.add_argument('--disable-dev-shm-usage')
    options.add_argument('--window-size=1920x1080')
    options.add_argument('--start-maximized')
    return webdriver.Chrome(service=service, options=options)


def get_selenium_driver():
    return get_local_driver() if os.environ.get('STAGE') == 'local' else get_headless_driver()
This is the code for installing the chromedriver:
I am getting this error:
Message: Service /opt/chromedriver unexpectedly exited. Status code was: 127
How should I fix this error? Should I also update the chromedriver and headless-chromium binaries? Which versions should I choose?
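To narrow things down, I wrote a small pre-flight check (my own helper, not part of the crawler) that reports whether a bundled binary is present, executable, and has its shared libraries resolved. As far as I understand, exit status 127 from the Service usually means one of those three things:

```python
import os
import subprocess


def diagnose_binary(path):
    """Classify why a bundled binary might fail with exit status 127."""
    if not os.path.exists(path):
        return "missing"
    if not os.access(path, os.X_OK):
        return "not executable"
    # Status 127 often means the loader could not start the binary,
    # e.g. unresolved shared libraries; `ldd` reports these as "not found".
    result = subprocess.run(["ldd", path], capture_output=True, text=True)
    if "not found" in result.stdout:
        return "missing shared libraries"
    return "ok"


# In the Lambda environment, the layer contents are mounted under /opt:
for binary in ("/opt/chromedriver", "/opt/headless-chromium"):
    print(binary, "->", diagnose_binary(binary))
```

Running this inside the handler shows me which of the two binaries the runtime actually cannot start.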