shuwang21 closed this issue 1 year ago
I'm unable to reproduce this (I have a different Python version, though). Does the JSONDecodeError happen when you run `search_query = scholarly.search_pubs('Acoustic Eavesdropping through Wireless Vibrometry')`? Which version of json do you have? You can find out by running `import json; print(json.__version__)` in Python.
Same issue here:

```python
from scholarly import scholarly, ProxyGenerator

# Set up a ProxyGenerator object to use free proxies
# This needs to be done only once per session
pg = ProxyGenerator()
pg.FreeProxies()
scholarly.use_proxy(pg)

# Now search Google Scholar from behind a proxy
search_query = scholarly.search_pubs('Perception of physical stability and center of mass of 3D objects')
scholarly.pprint(next(search_query))
```
returns

```
raise MaxTriesExceededException("Cannot Fetch from Google Scholar.")
scholarly._proxy_generator.MaxTriesExceededException: Cannot Fetch from Google Scholar.
```

even though `pg.FreeProxies()` returns `True`.
I am still unable to reproduce the JSONDecodeError. The query passed for me with ScraperAPI but not with FreeProxies.
The MaxTriesExceededException is not uncommon when using FreeProxies. Once you update to the just-released v1.7.2, and maybe once the existing set of proxies in FreeProxies gets unblocked, you should be able to use them again.
The json version is 2.0.9.
For ScraperAPI, the issue happens with `pg.ScraperAPI(api_key)`.
I am facing the same issue. Has there been any progress on this or tricks to get around it?
Same problem
Same problem
Does "Same problem" mean getting "JSONDecodeError", or "MaxTriesExceededException" error?
I got the JSONDecodeError when I (intentionally) gave an API_KEY that was invalid. When I gave my correct API_KEY, I did not get that error. So I assume that you supplied an invalid API key.
As for the MaxTriesExceededException error, it depends on what proxies are available when you query and whether they have been used sufficiently (by you and everyone else in the world). It is unfortunately not possible to reliably fetch publications using `search_pubs` with `FreeProxies`. However, you should be able to catch that exception and retry from your program.
**Describe the bug**
Really nice work. It seems that some of the proxy-related queries fail. I tried `pg.Tor_Internal(tor_cmd="xxx")` and `pg.FreeProxies()`. Both give me the following error. I also tried `pg.ScraperAPI(api_key)`, and it gives the same errors. Thanks a lot for your help.

**To Reproduce**
For ScraperAPI:

```python
pg.ScraperAPI(api_key)
```

For FreeProxies and Tor:

```python
search_query = scholarly.search_pubs('Perception of physical stability and center of mass of 3D objects')
```
**Expected behavior**
Should be able to return the correct result.

**Screenshots**
None

**Desktop (please complete the following information):**

**Do you plan on contributing?**
Your response below will clarify whether the maintainers can expect you to fix the bug you reported.

**Additional context**
Add any other context about the problem here.