bjarthur opened 3 years ago
The bitpattern histogram from the talk is created in this notebook: https://github.com/milankl/Isambard/blob/main/src/bitpattern_histograms.ipynb, it's part of an upcoming publication, happy to share once it's on the preprint server!
I know the DrWatson documentation is far behind where I'd like it to be. Sorry for that. It's almost too embarrassing to share, but this is how you can use DrWatson: https://github.com/milankl/Isambard/blob/main/src/watson_example.ipynb
Just execute your algorithm with it, interrupt at some point if the simulation takes too long, and get the stacktraces back. It works but is fairly preliminary; if you have any ideas how to improve it, feel free to raise them!
ah, you used BFloat16. ok, i'll try that. but why restrict the logbooks to be 16 bits?
Well, for an n-bit logbook you need to allocate an array of length 2^n of type UInt32/UInt64 (I went for the latter just to minimize the risk of a wraparound in longer simulations). For 16 bit that's 260KB/520KB, for 32 bit that's 17.2GB/34.4GB, and for 64 bit, well...
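The memory footprint follows directly from 2^n counters times the counter size; a quick Julia sketch (`logbook_bytes` is just an illustrative helper, not part of Sherlogs.jl):

```julia
# Bytes needed for an n-bit logbook: 2^n counters of type T.
logbook_bytes(nbits, T) = 2^nbits * sizeof(T)

logbook_bytes(16, UInt32)           # 262144 bytes, ≈ 260 KB
logbook_bytes(16, UInt64)           # 524288 bytes, ≈ 520 KB
logbook_bytes(32, UInt32) / 1e9     # ≈ 17.2 GB
logbook_bytes(32, UInt64) / 1e9     # ≈ 34.4 GB
```

A 64-bit logbook would need 2^64 counters, which is why 16 bits is the practical choice.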
So yes, for Float32 I use BFloat16, which is basically a very cheap binning method to collapse 2^16 neighbouring values into one. For Float64, I'd probably also use BFloat16 (the conversion Float64->Float32->BFloat16 should still be fairly cheap), with the risk that some non-zero Float64 values are rounded to zero(BFloat16) or Inf32; or define a "CFloat16" with 1 sign, 11 exponent and 4 mantissa bits, similar to how BFloat16s.jl does it.
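The cheap-binning idea can be sketched without any packages: since BFloat16 is essentially the top 16 bits of a Float32's bitpattern, shifting those bits down gives a 16-bit logbook index directly. A minimal sketch, assuming plain truncation rather than round-to-nearest (`bfloat_bin` is an illustrative helper, not Sherlogs.jl's actual implementation):

```julia
# Collapse a Float32 into a 16-bit bin by keeping only the top 16 bits
# of its bitpattern (sign + 8 exponent + 7 mantissa bits), i.e. a
# BFloat16-like truncation. 2^16 neighbouring Float32s share one bin.
bfloat_bin(x::Float32) = (reinterpret(UInt32, x) >> 16) % UInt16

logbook = zeros(UInt64, 2^16)        # one counter per 16-bit pattern
for x in randn(Float32, 10_000)
    logbook[bfloat_bin(x) + 1] += 1  # +1: Julia arrays are 1-based
end
```

For example, `bfloat_bin(1f0)` returns `0x3f80`, the upper half of `0x3f800000`, the bitpattern of `1f0`.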
you mentioned it at juliacon2021, but there is no mention in the README.
also, how did you generate the purple, green, and orange lines in that plot at juliacon? sherlogs.jl's logbook seems to just return everything as represented in 16-bit floats, but how do i get the 32- and 64-bit distribution?
thanks!