Hi everyone,

I've been playing with your library (mainly with the Traffic Analysis methods). Amazing work!
I have a couple of suggestions.
Querying logic
Bing's API is limited when it comes to traffic statistics because we can't filter and/or choose the dimensions we want. We need to use different endpoints for different purposes.
If I want to get the traffic for /blog/ between two dates:
I'd call the get_page_stats function
I'd then have to filter the result myself to exclude other pages and dates
This can get extremely tricky depending on the dimensions you want and may require several API calls. But the data we get is more complete, and the library would fit better into a data pipeline aimed at extracting data from Bing's API.
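To illustrate the manual workflow above, here is a minimal sketch of the client-side filtering step. The row shape returned by get_page_stats is an assumption for illustration (a list of dicts); the real function calls Bing's API and may return something different.

```python
from datetime import date

def get_page_stats():
    # Stand-in for the library's get_page_stats call; the real one
    # hits Bing's API. This sample row shape is an assumption.
    return [
        {"page": "/blog/post-1", "date": date(2023, 1, 15), "clicks": 12},
        {"page": "/about", "date": date(2023, 1, 20), "clicks": 3},
        {"page": "/blog/post-2", "date": date(2023, 3, 2), "clicks": 7},
    ]

def filter_stats(rows, prefix, start, stop):
    # The filtering the API can't do for us: keep only rows whose
    # page starts with the prefix and whose date is in the range.
    return [
        r for r in rows
        if r["page"].startswith(prefix) and start <= r["date"] <= stop
    ]

blog_jan = filter_stats(
    get_page_stats(), "/blog/", date(2023, 1, 1), date(2023, 2, 1)
)
```

Every dimension or date filter here means more client-side code, which is exactly the friction the generic querying structure below would remove.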
In a nutshell, my proposal would be to call these reports with a generic structure, similar to the approach I describe for my GSC library. We'd obviously need a different code structure to fit into your library.
webproperty = account['https://www.exemple.com/']
report = (
    webproperty
    # we call the query method
    .query
    # we define the dates
    .range(start="2023-01-01", stop="2023-02-01")
    # we define the dimensions
    .dimensions(['page'])
    # we get the data
    .get()
)
Under the hood, the range, filter and dimensions methods create an internal dict with the list of required dimensions. Based on that, the report is mapped to one or several Bing endpoints and the calls are made. Data is then grouped to keep only the dimensions listed in the dimensions calls.
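The mechanics described above could be sketched roughly like this. The class name, the internal dict layout, and the endpoint names in the mapping are all assumptions for illustration, not the library's actual API; a real implementation would perform the API calls and group the returned rows.

```python
class Query:
    # Hypothetical builder sketch; names are assumptions.
    def __init__(self):
        self._params = {}  # internal dict built up by each chained call

    def range(self, start, stop):
        self._params["range"] = (start, stop)
        return self  # returning self enables method chaining

    def dimensions(self, dims):
        self._params["dimensions"] = list(dims)
        return self

    def get(self):
        # Map the requested dimensions to the Bing endpoint(s) able to
        # serve them. A real implementation would call the API here and
        # group rows down to the requested dimensions.
        mapping = {("page",): "GetPageStats", ("query",): "GetQueryStats"}
        endpoint = mapping[tuple(self._params["dimensions"])]
        return {"endpoint": endpoint, **self._params}

report = (
    Query()
    .range(start="2023-01-01", stop="2023-02-01")
    .dimensions(["page"])
    .get()
)
```

The key design point is that the user only declares dates and dimensions; the dimensions-to-endpoint mapping and the follow-up grouping stay hidden inside get().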
I have a query.py file available here (private repo; I'd need to know who to grant access to beforehand) with this logic implemented. I think it could be included in your library as well, in the traffic_analysis file, to improve its usability.
Let me know what you think :)
Have a lovely weekend,
Antoine.