leftmove / wallstreetlocal

Free and open-source stock tracking website for America's biggest money managers.
https://wallstreetlocal.com
MIT License
435 stars 36 forks

Problematic share held? #4

Closed bolirev closed 5 months ago

bolirev commented 5 months ago

Hi, I am trying to understand the table displayed on wallstreetlocal. For filer 1067983 (Berkshire Hathaway Inc), I get the following recent table: https://www.wallstreetlocal.com/filers/1067983

We see that Berkshire Hathaway Inc owns roughly 9M shares of Apple (ticker AAPL). However, from other sources we know that Berkshire owns roughly 905M shares (e.g. https://www.investopedia.com/articles/investing/022816/top-5-positions-warren-buffetts-portfolio.asp).

Using the link provided on wallstreetlocal referring to the data source, I found this table: https://www.sec.gov/Archives/edgar/data/1067983/000095012324002518/0000950123-24-002518.txt. Looking for Apple, I get the following entries for the shares:

692000+3840000+24294000+59147916+2724000+20424207+61542988+12152000+47832000+666422889+2712000+3776000

which indeed totals roughly 905M shares.
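The arithmetic can be checked quickly; a minimal sketch in Python, using the share counts listed in the filing above:

```python
# Share counts for Apple (AAPL) listed across the separate entries in
# Berkshire Hathaway's 13F filing linked above.
apple_shares = [
    692_000, 3_840_000, 24_294_000, 59_147_916, 2_724_000, 20_424_207,
    61_542_988, 12_152_000, 47_832_000, 666_422_889, 2_712_000, 3_776_000,
]

total = sum(apple_shares)
print(f"{total:,}")  # 905,560,000 -> roughly 905M shares
```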

Therefore I am not sure whether this is a bug in wallstreetlocal or whether I am misunderstanding the results displayed on the webpage.

leftmove commented 5 months ago

After doing some digging, I found that this was indeed a bug.

If you look at the stocks from an individual filing, you'll see that each stock can be listed more than once. For example, Apple appears in multiple entries of Berkshire Hathaway's filing. (The SEC's HTML format is more readable, but it contains the exact same info as the link you sent.)

I don't know why this is, but it means that in order to get totals for things like market value, you need to add up every value listed for each stock, grouped by its name (or CUSIP). While this was already done for market value, I failed to apply the same totaling to shares held.
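The grouping described above can be sketched roughly as follows. This is an illustrative example, not wallstreetlocal's actual code; the record fields (`cusip`, `shares`, `value`) and sample numbers are hypothetical:

```python
from collections import defaultdict

# Hypothetical holding entries parsed from a 13F filing: the same stock
# (same CUSIP) may appear in several entries, so both shares and market
# value must be summed per CUSIP. Field names and figures are illustrative.
holdings = [
    {"cusip": "037833100", "name": "APPLE INC", "shares": 692_000, "value": 119_000_000},
    {"cusip": "037833100", "name": "APPLE INC", "shares": 3_840_000, "value": 661_000_000},
    {"cusip": "084670702", "name": "BERKSHIRE HATH", "shares": 1_000, "value": 350_000},
]

totals = defaultdict(lambda: {"shares": 0, "value": 0})
for entry in holdings:
    totals[entry["cusip"]]["shares"] += entry["shares"]  # the step the bug omitted
    totals[entry["cusip"]]["value"] += entry["value"]

print(totals["037833100"]["shares"])  # 4532000 (both Apple entries combined)
```

The bug was exactly that the `shares` field was taken from a single entry instead of being summed across all entries for the same CUSIP, while `value` already was.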

To fix the issue, I just had to add one extra line of code mirroring the market value totaling. It should be fixed now, and all filers in the database should be rebuilt soon.

Thank you for bringing this to my attention!