I'm creating a scraper on Google Colab using Selenium, but now it doesn't work. It did work in the past, but I don't know why it no longer does.
The code is:
#dependencies
!pip install selenium
!apt-get update
!apt install chromium-chromedriver
!pip install fake-useragent
from selenium import webdriver
from fake_useragent import UserAgent
# options for chromedriver
ua = UserAgent()
userAgent = ua.random
chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('--headless')
chrome_options.add_argument('--no-sandbox')
chrome_options.add_argument('--disable-dev-shm-usage')
# add the random user agent before creating the driver, otherwise the option is ignored
chrome_options.add_argument('--user-agent=' + userAgent)
driver = webdriver.Chrome('chromedriver', chrome_options=chrome_options)
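For reference, once the driver is created I only do simple page loads with it, roughly like this (the URL is just a placeholder):

# minimal usage once the driver starts (placeholder URL)
driver.get('https://example.com')
print(driver.title)
driver.quit()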
When I run this code, Colab shows the following error:
"Message: Service chromedriver unexpectedly exited. Status code was: 1"
Do you know of any solution? I've looked for an answer in other related topics, but nothing has worked for me.
I found a possible explanation here: "How to install Chromium without snap?", where one answer says:
"...was the result of an issue induced as the colab system was updated from v18.04 to ubuntu v20.04 LTS and with Ubuntu v20.04 LTS google-colaboratory no longer distributes chromium-browser outside of a snap package..."
but I'd like to keep using Google Colaboratory. Is that possible now?
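If the fix is to install a non-snap Chromium as that answer suggests, I assume the driver setup would end up looking something like this, with the binary and driver paths depending on where they actually get installed (both paths below are just my guess):

from selenium import webdriver
from selenium.webdriver.chrome.service import Service

chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('--headless')
chrome_options.add_argument('--no-sandbox')
chrome_options.add_argument('--disable-dev-shm-usage')
# point Selenium at the non-snap chromium binary and its matching driver
# (these paths are assumptions and depend on how chromium gets installed)
chrome_options.binary_location = '/usr/lib/chromium-browser/chromium-browser'
service = Service('/usr/lib/chromium-browser/chromedriver')
driver = webdriver.Chrome(service=service, options=chrome_options)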