Hi Shranik,
You can export to CSV fairly easily by converting your report to a pandas.DataFrame, then using the DataFrame.to_csv method.
For example:
report = webproperty.query.range('today', days=-7).dimension('query').get()
df = report.to_dataframe()
df.to_csv('report.csv', index=False)
You'll need to have pandas installed. If you don't already:
$ pip install pandas
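Putting the pieces together, a complete end-to-end version might look something like this (the client_secrets.json path and property URL are placeholders you'd replace with your own):

import searchconsole

# Authenticate and pick the site (web property) to query.
account = searchconsole.authenticate(client_config='auth/client_secrets.json')
webproperty = account['https://www.example.com/']

# Pull the last 7 days of query data and export it to CSV via pandas.
report = webproperty.query.range('today', days=-7).dimension('query').get()
df = report.to_dataframe()
df.to_csv('report.csv', index=False)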
By default, the package will paginate through all the data in your site unless you use query.limit. Feel free to increase the page size from 5,000 to 25,000 and submit a pull request!
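For instance, to fetch only the first 100 rows rather than paginating through everything, something like this should work (reusing the webproperty object from the snippet above; I believe query.limit takes the number of rows to return):

# Restrict the query to the first 100 rows instead of paginating through all results.
report = webproperty.query.range('today', days=-7).dimension('query').limit(100).get()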
Thanks, Josh
Hey Josh,
Thanks for the reply! I'm just getting started with Python, so I want to confirm where in the file I should add the code to increase the request limit from 5,000 to 25,000 rows.
I basically started from this query:

import searchconsole

account = searchconsole.authenticate(client_config='auth/client_secrets.json')
webproperty = account['https://www.example.com/']
report = webproperty.query.range('today', days=-7).dimension('query').get()
print(report.rows)
Hi Shranik,
No worries that you're just getting started - this would be a perfect first issue if you're interested.
You'd need to increase the page size the package uses when it paginates through results. It's set in query.py and is referenced in a few places. Look forward to hearing from you if you're up for it.
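Purely as an illustration of the kind of change involved (the constant name below is hypothetical; check query.py for the actual identifier and every place it's used), it would be roughly:

# query.py (hypothetical excerpt): rows requested per API call when paginating.
# The Search Analytics API now allows up to 25,000 rows per request.
ROW_LIMIT = 25000  # previously 5000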
Cheers, Josh
@luismarcanth implemented this in #9 and I've merged it with the master branch. Thanks!
Hi,
I want to extract the data to a CSV file but I'm not able to find any code related to that. Please help. Also, now that the Google Search Console API allows extracting 25,000 rows per request, how do I implement pagination to get all of the data for my site?