Closed: markxbaker closed this issue 1 month ago.
To add to this, I am running in Colab. I have previously run several rules-based backtests with 600+ tickers without issue.
Hi @markxbaker,
Do you mean that without using any models, this works fine? My guess is you're running out of memory when loading models for each of those 50-100 tickers.
Can you try passing disable_parallel=True to backtest()? https://www.pybroker.com/en/latest/reference/pybroker.strategy.html#pybroker.strategy.Strategy.backtest
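For anyone finding this later, a minimal sketch of where the flag goes (the symbols and the no-op execution are placeholders; YFinance is pybroker's bundled Yahoo Finance data source):

```python
from pybroker import Strategy, YFinance

def hold(ctx):
    # Placeholder execution that trades nothing.
    pass

strategy = Strategy(YFinance(), start_date='2014-01-01', end_date='2024-01-09')
strategy.add_execution(hold, ['AAPL', 'MSFT'])

# disable_parallel=True computes indicators and trains models serially for
# all symbols instead of spawning parallel worker processes, trading speed
# for a smaller memory footprint.
result = strategy.backtest(disable_parallel=True)
```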
Yes, with no models it works OK. I will try as suggested and also post code when I get a chance, thanks. Have a good weekend!
This seems to work OK, thanks: passing disable_parallel=True to backtest().
It was a memory issue. The fix above worked up to a point and then failed again; upgrading Colab resolved it.
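For anyone hitting the same wall, one quick way to confirm memory exhaustion is to check available RAM around the failing step (psutil is available in Colab's default runtime):

```python
import psutil

# Snapshot available RAM before and after loading models/indicators to see
# whether the worker crashes coincide with memory running out.
mem = psutil.virtual_memory()
print(f"RAM available: {mem.available / 1e9:.1f} GB of {mem.total / 1e9:.1f} GB")
```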
Hi,
When running a strategy that uses a cached model and a few indicators, I hit the issue below once the number of instruments grows past a certain point. It works fine with up to 10 instruments, but dies when I try 50 or 100. Is it possible to extend the worker timeout, or is there another workaround?
Backtesting: 2014-01-01 00:00:00 to 2024-01-09 00:00:00
Loading bar data...
[*100%***] 100 of 100 completed
Loaded bar data: 0:00:23
Computing indicators... 0% (0 of 3800) | | Elapsed Time: 0:00:00 ETA: --:--:--
/usr/local/lib/python3.10/dist-packages/joblib/externals/loky/process_executor.py:752: UserWarning: A worker stopped while some jobs were given to the executor. This can be caused by a too short worker timeout or by a memory leak.
  warnings.warn(
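For context, here is a minimal sketch of the kind of setup described above: a cached model plus an indicator, backtested across many symbols. All names (sma_20, train_fn, lr_model, the symbol list) are hypothetical stand-ins; the pybroker calls follow the library's documented API, and disable_parallel=True is the serial fallback discussed above.

```python
import pandas as pd
import pybroker
from pybroker import Strategy, YFinance
from sklearn.linear_model import LinearRegression

# Cache bar data, indicator values, and trained models between runs.
pybroker.enable_caches('demo_cache')

# Hypothetical indicator: 20-bar moving average of the close
# (min_periods=1 avoids NaNs during the warmup bars).
sma_20 = pybroker.indicator(
    'sma_20',
    lambda data: pd.Series(data.close).rolling(20, min_periods=1).mean().values)

def train_fn(symbol, train_data, test_data):
    # Hypothetical model: regress the close on the indicator. pybroker
    # passes per-symbol train/test DataFrames that include the registered
    # indicator columns.
    return LinearRegression().fit(train_data[['sma_20']], train_data['close'])

lr_model = pybroker.model(
    'lr_model', train_fn, indicators=[sma_20],
    input_data_fn=lambda df: df[['sma_20']],
    predict_fn=lambda model, df: model.predict(df))

def exec_fn(ctx):
    # Hypothetical rule: go long when the model predicts a higher close.
    preds = ctx.preds('lr_model')
    if not ctx.long_pos() and len(preds) and preds[-1] > ctx.close[-1]:
        ctx.buy_shares = 100

strategy = Strategy(YFinance(), start_date='2014-01-01', end_date='2024-01-09')
strategy.add_execution(exec_fn, ['AAPL', 'MSFT', 'NVDA'], models=lr_model)

# train_size reserves a share of the data for model training;
# disable_parallel=True keeps indicator computation and model training out
# of parallel worker processes.
result = strategy.backtest(train_size=0.5, disable_parallel=True)
```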