alvarobartt / investpy

Financial Data Extraction from Investing.com with Python
https://investpy.readthedocs.io/
MIT License

ConnectionError: ERR#0015: error 403, try again later. #600

Open · divyankm opened this issue 1 year ago

divyankm commented 1 year ago

Code:

import investpy

df = investpy.get_stock_historical_data(stock='AAPL',
                                        country='United States',
                                        from_date='01/01/2010',
                                        to_date='01/01/2020')
print(df.head())

Error:

ConnectionError                           Traceback (most recent call last)
[<ipython-input-4-f6f4235b7e47>](https://localhost:8080/#) in <module>
      4                                         country='United States',
      5                                         from_date='01/01/2010',
----> 6                                         to_date='01/01/2020')
      7 print(df.head())

[/usr/local/lib/python3.7/dist-packages/investpy/stocks.py](https://localhost:8080/#) in get_stock_historical_data(stock, country, from_date, to_date, as_json, order, interval)
    663         if req.status_code != 200:
    664             raise ConnectionError(
--> 665                 "ERR#0015: error " + str(req.status_code) + ", try again later."
    666             )
    667 

ConnectionError: ERR#0015: error 403, try again later.

PolBarreiro commented 1 year ago

I used a 15-second delay between requests and it was going smoothly (slow, but smooth).
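
A minimal sketch of that approach, assuming the delay is applied between successive investpy calls; the 15-second value, retry count, and symbol list are illustrative, not from the original comment:

import time
import investpy

def fetch_with_delay(symbols, delay_seconds=15, max_retries=3):
    # Fetch historical data for several symbols, sleeping between requests so
    # Investing.com is less likely to answer with HTTP 403.
    results = {}
    for symbol in symbols:
        for attempt in range(max_retries):
            try:
                results[symbol] = investpy.get_stock_historical_data(
                    stock=symbol,
                    country='United States',
                    from_date='01/01/2010',
                    to_date='01/01/2020')
                break
            except ConnectionError:
                # ERR#0015 (HTTP 403): back off before retrying
                time.sleep(delay_seconds)
        time.sleep(delay_seconds)  # pause between symbols as well
    return results

data_by_symbol = fetch_with_delay(['AAPL', 'MSFT'])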

sampathkar commented 1 year ago

> > You can use [investiny]? I get the same error message when using it. Ultimately, does anyone have a reliable alternative? Given this restriction, it is hard to rely on a download script. Has anyone got an answer from investing.com?
>
> You can use the workaround API that I built for this community: http://api.scraperlink.com/investpy/
>
> For example, here's the sample API for stocks:
>
> http://api.scraperlink.com/investpy/?email=your@email.com&type=historical_data&product=stocks&country=united%20states&symbol=TSLA&from_date=09/27/2022&to_date=09/28/2022

Thanks a lot for this :)
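
For reference, a minimal sketch of calling that workaround endpoint from Python. It assumes the endpoint returns JSON; the response fields are not documented here, so the snippet just prints whatever comes back:

import requests

url = 'http://api.scraperlink.com/investpy/'
params = {
    'email': 'your@email.com',
    'type': 'historical_data',
    'product': 'stocks',
    'country': 'united states',
    'symbol': 'TSLA',
    'from_date': '09/27/2022',
    'to_date': '09/28/2022',
}

resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()
print(resp.json())  # inspect the payload before building anything on top of it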

sampathkar commented 1 year ago

> You can use the workaround API that I built for this community: http://api.scraperlink.com/investpy/ For example, here's the sample API for stocks:
>
> http://api.scraperlink.com/investpy/?email=your@email.com&type=historical_data&product=stocks&country=united%20states&symbol=TSLA&from_date=09/27/2022&to_date=09/28/2022

Thanks a lot for this :)

However, after using it a few times, I get the error message below.

Error message: Expecting value: line 1 column 1 (char 0)

webdevzilla commented 1 year ago

> However, after using it a few times, I get the error message below.
>
> Error message: Expecting value: line 1 column 1 (char 0)

I don't think that error is from me. What's the query you're using?
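
For what it's worth, "Expecting value: line 1 column 1 (char 0)" is the message json.loads (and requests' resp.json()) produce when the body is not valid JSON, for example an empty response or an HTML error page. A small self-contained sketch for checking what the endpoint actually returned, using the same sample query URL as above (illustrative only):

import requests

url = ('http://api.scraperlink.com/investpy/?email=your@email.com'
       '&type=historical_data&product=stocks&country=united%20states'
       '&symbol=TSLA&from_date=09/27/2022&to_date=09/28/2022')
resp = requests.get(url, timeout=30)

try:
    data = resp.json()
except ValueError:
    # The body is not JSON: show the status code and the start of the raw
    # response to see whether it is an error page or a rate-limit notice.
    print(resp.status_code, resp.text[:300])
else:
    print(type(data), str(data)[:300])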

sampathkar commented 1 year ago

Ah it started working again. Thanks :)

wcmpeters commented 1 year ago

Hello, I have been accessing the Investing.com AJAX endpoint with a POST request from R. This worked for more than three years, but since early September it no longer does. Below is the reply I got from investing.com.

Regards, Wim

> Hello Wim,
>
> Thank you for contacting us.
>
> At the moment, Investing.com does not offer any support for accessing its services from R, Python, or similar environments. API access and scraping are also not allowed.
>
> Our tooling monitors for this and blocks the account. You can use the services in supported browsers such as Chrome, Edge, and Safari.
>
> Regards, Puneeth