Closed EcoFin closed 2 years ago
Arthur,
Thanks for making me aware of this. I hadn't noticed this before. I was just able to reproduce what you are seeing. I will have a look into this soon.
Farrell
Farrell,
I looked at adjust_percents but didn’t see anything right away. I’ll keep looking too. I just tripped across it by accident.
arthur
Arthur,
You were right about where you thought the error was: it was in portfolio.adjust_percents(). Symbols are sorted by weight so that the sells happen before the buys on the same day; however, after the symbols were sorted into the weighted dictionary, the sorted dictionary wasn't actually used. You can see the fix in pinkfish/portfolio.py. Here is the change. You can see that it was selling in portfolio symbol order and NOT using the sorted dict w, which is sorted in descending order by weight.
(lines prefixed with - were deleted; lines prefixed with + were added)

-for symbol in self.symbols:
+for symbol, weight in w.items():
     price = prices[symbol]
-    weight = w[symbol]
     direction = directions[symbol]
     self.adjust_percent(date, price, weight, symbol, row, direction)
 return w
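The sells-before-buys principle behind the fix can be sketched in standalone Python. This is a hypothetical simplification, not the pinkfish implementation: here the execution order is derived from the change in weight (most negative first), so positions being trimmed free up cash before any position is increased.

```python
# Minimal sketch of sells-before-buys rebalance ordering.
# Hypothetical helper names; not the actual pinkfish code.

def rebalance_order(current, targets):
    """Return symbols in execution order: biggest reductions first.

    Iterating the *sorted* result guarantees sells run before buys;
    iterating the raw symbol list (the original bug) does not.
    """
    # Change in weight per symbol; negative = sell, positive = buy.
    delta = {s: targets[s] - current.get(s, 0.0) for s in targets}
    # Most negative change first, so cash is freed before any buy.
    return [s for s, _ in sorted(delta.items(), key=lambda kv: kv[1])]

current = {"SPY": 0.5, "TLT": 0.5, "GLD": 0.0}
targets = {"SPY": 0.0, "TLT": 0.5, "GLD": 0.5}   # exit SPY, enter GLD
print(rebalance_order(current, targets))          # ['SPY', 'TLT', 'GLD']
```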
Thanks for finding and reporting this bug.
Farrell
P.S. I may follow up with you regarding Norgate when I get some free time. I want to see if perhaps they will temporarily give me free access to integrate their product into pinkfish.
Farrell,
You got a step further than I did. I understood that it loaded up the dicts and "reverse ordered" them, which should have meant that the sells were done first.
I didn’t actually notice that it wasn’t used. But what you describe is exactly what I figured it had to be. This is great! I’ll try to implement the fix tomorrow.
Richard Dale is the primary contact at Norgate. They are just very helpful all round.
I’d be pleased to provide any practical assistance I can on the side. I dumped CSI about 18 months ago for a complete Norgate subscription and could not be happier. There is actually a lot more in their DB than what gets exported to the CSV files. It seems to me that it wouldn’t be hard to generate a timeseries DataFrame directly from Norgate’s DB (skipping the existing fetch_timeseries entirely). I have thought of writing the interface myself, but for the time being I have just written a code snippet to pre-process Norgate CSV files.
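For what it's worth, that pre-processing step can be sketched roughly like this. The column layout and the load_norgate_csv helper are assumptions for illustration, not the actual Norgate export format or Arthur's snippet:

```python
# Hedged sketch: turn a Norgate-style CSV export into the kind of
# OHLCV DataFrame fetch_timeseries() would otherwise produce.
# Column names below are assumptions about the export layout.
import io
import pandas as pd

csv_text = """Date,Open,High,Low,Close,Volume
2010-05-03,120.18,121.00,119.80,120.55,1500000
2010-05-04,119.90,120.10,118.50,118.70,2100000
"""

def load_norgate_csv(buf):
    """Read a CSV export into a date-indexed OHLCV frame."""
    df = pd.read_csv(buf, parse_dates=["Date"], index_col="Date")
    # Norgate prices are assumed already adjusted, so adj_close == close.
    df["adj_close"] = df["Close"]
    df.columns = [c.lower() for c in df.columns]
    return df

ts = load_norgate_csv(io.StringIO(csv_text))
print(ts.loc["2010-05-03", "close"])  # 120.55
```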
Thanks again!
arthur
Farrell,
This really has me baffled... Consider your Example 200 (the Antonacci GEM model). To keep things simple I will just talk about CAGRs (other stats more or less follow suit). Here is a summary of my experiments:
You show 10.59% in the downloaded notebook, which seems plausible to me. So I ran it on my Norgate data to verify my pre-processing code. But instead of 10.6%, I got about 8.5%. That seemed odd, and my initial thought was a data problem. So I looked at the logs. I found your version missing trades on Yahoo data that are taken on Norgate data. For example, using Yahoo data you are net zero on 3 May 2010 and only take a SPY position on June 1. On Norgate data, I got a SPY position on May 3 (at a higher price!); I think that is correct. The Antonacci model should never be out of the market. But whatever.
I am also seeing a variable number of minuscule trades (1 or 2 shares) on different runs. That suggests that maybe sells aren't happening before buys. But again, let's leave that for now.
The point is that if I clear outputs and rerun on the same data (it doesn't matter whether it's from Yahoo or Norgate), I get a different CAGR more or less at random! I might get 8.5%, 9.5%, 9.9%, 10.5%, or 11.91%. No way of predicting. I normally work in VS Code. I have tried "clear outputs/restart kernel" between consecutive runs; I have tried a complete shutdown/restart of VS Code. On the off chance that there is something wrong with my VS Code setup, I just reran some of the experiments in Jupyter Lab. Same deal. I saved 3 examples to HTML if you want me to send them. It's not accidentally picking up your random lookback code; I made it print the lookback to make sure.
Bottom line: two consecutive runs on the same data will give different results. In my case, always; I have never had two in a row the same. That means the backtest results can't be trusted. I think it may have to do with how the adjust_percent rebalancing code places orders. I will try to figure that out. Beyond that, I'm not sure what to look for.
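For context, one classic source of run-to-run instability in Python code generally (not necessarily what pinkfish does) is iterating an unordered collection of ticker strings: string hashing is randomized per interpreter process, so a set's iteration order can differ between runs even on identical data. A generic illustration:

```python
# Generic illustration of order (non)determinism in Python; not the
# pinkfish code itself. A set of strings has no guaranteed iteration
# order across interpreter runs (hash randomization), while sorting
# always yields the same, reproducible order.

symbols = {"SPY", "EFA", "AGG", "GLD", "TLT"}

unstable_order = list(symbols)   # may differ on the next interpreter run
stable_order = sorted(symbols)   # identical on every run

print(stable_order)  # ['AGG', 'EFA', 'GLD', 'SPY', 'TLT']
```

If order placement depends on such an iteration, two runs on the same data can execute trades in a different sequence and produce different equity curves.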
Have you run into this before? Any advice?
Best regards!