If you have used LinkedIn, you have probably come across posts offering helpful resources that ask readers for their email address; the replies in the comments usually look like:
Interested!
<email-address>
I don't like this pattern, since the link could just be shared in the post itself :angry:. Nevertheless, I decided to automate the work of collecting all these emails.
All the scraped comments, along with their associated columns, are stored in a CSV file.
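For reference, the CSV output could be produced with Python's `csv` module along these lines. The column names here are illustrative assumptions, not the script's actual schema:

```python
import csv

# Hypothetical comment records; the real script builds these from scraped HTML.
comments = [
    {"name": "Jane Doe", "comment": "Interested!", "email": "jane@example.com",
     "profile_pic_url": "https://media.licdn.com/img123"},
]

with open("comments.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "comment", "email", "profile_pic_url"])
    writer.writeheader()        # first row: column names
    writer.writerows(comments)  # one row per scraped comment
```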
- Install the dependencies:

```bash
pip install -r requirements.txt
```
- In `config.json`, enter the URL of the required LinkedIn post in the `post_url` field:

```json
"post_url": ""
```

> *__NOTE__*: If you forget to enter it here, it will be asked for during execution of the script.

You can also change the CSV file name (in which scraped data will be stored) and the directory name (into which profile pictures will be downloaded) in `config.json`.
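A `config.json` along these lines would work; `post_url` is the field named above, while the other two key names are hypothetical stand-ins for the file-name and directory-name fields:

```json
{
  "post_url": "https://www.linkedin.com/posts/...",
  "csv_file": "comments.csv",
  "pfp_dir": "profile_pics"
}
```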
Help:

```
usage: main.py [-h] [--headless] [--show-replies] [--download-pfp]

Linkedin Scraping.

options:
  -h, --help      show this help message and exit
  --headless      Go headless browsing
  --show-replies  Load all replies to comments
  --download-pfp  Download profile pictures of commentors
```
> *__NOTE__*: Even if the flag `--download-pfp` isn't provided, the URLs of the profile pictures will still be stored in the output CSV.
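The flags above map onto a standard `argparse` setup; a minimal sketch (not the script's exact code) looks like this:

```python
import argparse

parser = argparse.ArgumentParser(description="Linkedin Scraping.")
parser.add_argument("--headless", action="store_true", help="Go headless browsing")
parser.add_argument("--show-replies", action="store_true", help="Load all replies to comments")
parser.add_argument("--download-pfp", action="store_true", help="Download profile pictures of commentors")

# Example invocation: run headless and also download profile pictures.
args = parser.parse_args(["--headless", "--download-pfp"])
```

Note that `argparse` converts dashes to underscores, so `--download-pfp` becomes `args.download_pfp`.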
- Run the script:

```bash
python main.py
```

You will be asked for your LinkedIn login email and password, and then the scraping process will start.
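The fallback prompt for a missing `post_url` (mentioned in the note above) could work roughly like this; this is a sketch assuming the config is a plain dict loaded from `config.json`, not the script's actual code:

```python
def resolve_post_url(config):
    """Return post_url from the config, or prompt the user if it is empty."""
    url = config.get("post_url", "").strip()
    if not url:
        url = input("Enter the LinkedIn post URL: ").strip()
    return url

# With a filled-in config, no prompt is needed:
url = resolve_post_url({"post_url": "https://www.linkedin.com/posts/example"})
```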
`config.json` contains various fields describing how to locate the HTML elements to scrape (by name or XPath), along with other metadata.
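As an illustration, selector entries of that shape could be loaded and dispatched like this. The key names and structure below are assumptions, not the actual config schema, and the resulting `(by, value)` pair is what Selenium's `driver.find_element(by, value)` expects:

```python
import json

# Hypothetical excerpt of config.json: each element is located by "name" or "xpath".
raw = """
{
  "selectors": {
    "comment_box": {"by": "xpath", "value": "//div[@class='comments-comment-item']"},
    "login_email": {"by": "name", "value": "session_key"}
  }
}
"""

config = json.loads(raw)

def locator(key):
    """Return a (by, value) pair for the named element."""
    entry = config["selectors"][key]
    return entry["by"], entry["value"]
```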
Suggestions and contributions are always welcome! :smile: