omkarcloud / google-maps-scraper


Add Shell Arguments #139

Closed: protwan closed this issue 1 month ago

protwan commented 4 months ago

I propose that the application take its parameters from the command line instead of hardcoding them in main.py. The output should also be written to stdout, so it can easily be redirected to a file if needed.

I have changed the script a little for my use case. This is incomplete and just a quick glimpse of how I think it should look.

import argparse
import sys

from src.gmaps import Gmaps

# Read the scraper options from the command line instead of hardcoding them in main.py.
parser = argparse.ArgumentParser(description="Scrape Google Maps places for a search query.")
parser.add_argument("query", type=str, help="search query, e.g. 'restaurants in paris'")
parser.add_argument("-m", "--max-results", type=int, default=None, help="maximum number of places to scrape")
parser.add_argument("--has-website", action="store_true", default=None, help="only keep places that have a website")
parser.add_argument("--has-phone", action="store_true", default=None, help="only keep places that have a phone number")
parser.add_argument("--fields", type=str, nargs="+", default="default", help="fields to include in the output")

args = parser.parse_args()
# Log the parsed arguments to stderr so stdout stays free for the scraped data.
sys.stderr.write(str(args) + "\n")

Gmaps.places([args.query],
             max=args.max_results,
             has_website=args.has_website,
             has_phone=args.has_phone,
             fields=args.fields)

Use it like this:

python main.py --has-website --has-phone --fields name website phone owner address --max-results 100 "restaurants in paris" > "restaurants.json"
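
Note that the sketch above never prints the scraped data to stdout, so the redirect only captures whatever the library itself happens to print. A minimal sketch of emitting JSON on stdout, replacing the final Gmaps.places call, could look like the following; it assumes Gmaps.places returns the scraped places as JSON-serializable data, which is an assumption not confirmed in this thread (the library may only write files to its output folder):

import json
import sys

# Assumption: Gmaps.places returns the scraped places as JSON-serializable data.
# If it only writes files to the output folder, those files would have to be read instead.
results = Gmaps.places([args.query],
                       max=args.max_results,
                       has_website=args.has_website,
                       has_phone=args.has_phone,
                       fields=args.fields)

# Emit the data as JSON on stdout so that `> restaurants.json` captures it.
json.dump(results, sys.stdout, ensure_ascii=False, indent=2)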

Benefits:

Chetan11-dev commented 4 months ago

We are working on creating an API version, which you could use with curl instead of shell arguments.
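
The thread does not show what that API looks like, so purely as an illustration, a call to such an API might resemble the sketch below; the host, port, endpoint path, and payload are all hypothetical placeholders, not the project's documented interface:

import requests

# Hypothetical example only: the URL and request body below are assumptions,
# not the scraper's actual API.
response = requests.post(
    "http://localhost:8000/api/tasks",
    json={"query": "restaurants in paris", "max_results": 100},
)
response.raise_for_status()
print(response.json())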

Chetan11-dev commented 1 month ago

We have released a new version with API integration. Kindly run the command

python -m pip install bota botasaurus_api botasaurus_driver botasaurus-proxy-authentication botasaurus_server --upgrade

and then run the commands below.

1️⃣ Clone the Magic πŸ§™β€β™€οΈ:

git clone https://github.com/omkarcloud/google-maps-scraper
cd google-maps-scraper

2️⃣ Install Dependencies πŸ“¦:

python -m pip install -r requirements.txt && python run.py install

3️⃣ Launch the UI Dashboard πŸš€:

python run.py