Closed zonca closed 4 years ago
It's quite a big effort, since it requires roughly a CPU month. I would have to make the weight generator batch-ready. Do we have a concrete use case?
yes, Simons Observatory has all Large Aperture Telescope maps at 8192
I'll see what I can do. Most likely I won't have results before next year though ...
Thanks, mid-January would be ideal. Would it help if you had access to Popeye? 48 cores and 768 GB per node.
Thanks! But I should be able to use one of the MPA clusters. Actually there is a queue which might have sufficiently long runtime so that I can do this in one go...
I have the data, but GitHub's file size limit within a repo seems to be 100 MB :-/
Ok, I'll set up git LFS and let you know
git LFS doesn't support GitHub Pages, so we could store the file in the repository, but it would not be served at https://healpy.github.io/healpy-data/
@mreineck how big is the file? can you gzip it?
The file is 400 MB, and 373 MB gzipped, so compression is no help there. On the other hand, this file is needed by perhaps two groups per year worldwide ... does it really need this kind of distribution channel?
Reduce to lower precision floating point numbers?
Then you gain a factor of 2 (still not enough), and a large part of the accuracy achievable with these weights is lost.
@mreineck can you please attach the file uncompressed and at high-precision to (limit for release artifacts is 2 GB): https://github.com/healpy/healpy-data/releases/edit/1.0
Will do. This might take almost forever, since I'm on a bad ADSL line.
if you have a copy on a server can you copy it to NERSC?
thanks @mreineck, it is available at https://github.com/healpy/healpy-data/releases/download/1.0/healpix_full_weights_nside_8192.fits
had to rename the release to https://github.com/healpy/healpy-data/releases/tag/full_weights, to have the same naming.
Hi, sorry to bother you, but how do I use these files? Is there a function (or an algorithm) to get the weight for each pixel from the file for a given nside? There are fewer rows than npix.
It's a flag in map2alm, the filename is passed into C++, see https://github.com/healpy/healpy/blob/8710c9fcf901a0a3e4efff37593c8d9ea17c426b/healpy/sphtfunc.py#L286
@zonca Thanks, of course, but for my problem I would like to get the weight for each pixel precisely. Is that possible?
I don't know, you should look in the healpix C++ source code and check how they use it.
The relevant source code is in the routines extract_fullweights() and apply_fullweights() in weight_utils.cc of Healpix C++. The code is a bit complicated, because it makes use of all the symmetries of the Healpix grid to store only the minimum subset of weights.
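As a rough illustration of why the file has fewer rows than npix (this is back-of-envelope symmetry counting, not the actual storage layout used by weight_utils.cc):

```python
# Pure-Python illustration of HEALPix symmetry counting; the real
# compression scheme in weight_utils.cc exploits further in-ring
# symmetries and is more involved.

def nside2npix(nside):
    # total number of pixels in a HEALPix map
    return 12 * nside ** 2

def n_rings(nside):
    # number of iso-latitude pixel rings in the map
    return 4 * nside - 1

def n_independent_rings(nside):
    # north-south mirror symmetry: only the northern rings
    # plus the equatorial ring carry independent information
    return 2 * nside

nside = 8192
print(nside2npix(nside))           # 805306368 pixels
print(n_rings(nside))              # 32767 rings
print(n_independent_rings(nside))  # 16384 independent rings
```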
@mreineck would it be possible to also compute pixel weights for NSIDE 8192?