snoplusuk / echidna


Parallelsmearing #75

Closed Mark--S closed 8 years ago

jwaterfield commented 9 years ago

@ashleyrback Would you mind working on this PR while I work on Matt's config PR? It is for the old versions of the spectra and reduces smearing time by ~1/2 using 4 cores. A similar method could be applied to the floating backgrounds limit setting method to improve its processing time. A rough sketch of the idea is below.
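For reference, here is a minimal sketch of the approach (not echidna's actual API; smear_slice, the resolution model and the chunking are illustrative assumptions): split the events into chunks, smear each chunk in a worker process and recombine the results.

import multiprocessing

import numpy


def smear_slice(energies):
    """Apply Gaussian energy smearing to one chunk of events (placeholder model)."""
    sigma = 0.05 * numpy.sqrt(energies)  # assumed energy resolution
    return numpy.random.normal(energies, sigma)


def smear_parallel(energies, num_cores=4):
    """Smear the full set of energies using num_cores worker processes."""
    chunks = numpy.array_split(energies, num_cores)
    with multiprocessing.Pool(num_cores) as pool:
        smeared = pool.map(smear_slice, chunks)
    return numpy.concatenate(smeared)


if __name__ == "__main__":
    energies = numpy.random.uniform(1.0, 3.0, 100000)  # dummy spectrum
    print(smear_parallel(energies, num_cores=4).shape)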

ashleyrback commented 9 years ago

Sure, I'll take a look. Presumably we want to try and merge this before #68 then, and you can then merge the changes from this PR into #68. I don't think there should be any conflicts anyway, as it looks like #68 doesn't modify the smear module.

jwaterfield commented 9 years ago

Yup, I think it'll take me a couple of days to get the config PR ready anyway.

ashleyrback commented 9 years ago

@Mark--S despite all the comments, this looks like great work. I think we just need to refine the structure a bit so it is more modular, with less code repetition. If you want to discuss on Skype, or in person at the meeting, let me know. Cheers.

ashleyrback commented 9 years ago

Also don't forget to add yourself to the copyright --> see LICENSE.md

ashleyrback commented 9 years ago

Also there are lots of pep8 errors for smear.py and some for the other two files. To check pep8, first install (if not already installed):

$ pip install pep8

then check the file by just running:

$ pep8 file.py

I'm not so worried about line-too-long errors, especially if a line is only just over 79 chars or if breaking it would reduce readability, although do look carefully at lines over 100 chars. There are lots of errors involving missing or excess whitespace; it would be much appreciated if you could fix these.
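For example, if you want to ignore the line-too-long check (E501) while cleaning up the rest, pep8 accepts an --ignore option:

$ pep8 --ignore=E501 smear.py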

ashleyrback commented 9 years ago

I'm going to delay this until after v0.3-alpha. @Mark--S we should have a chat about this at some point over Skype.

ashleyrback commented 8 years ago

parallelsmearing

ashleyrback commented 8 years ago

@Mark--S Closing for now; we can always re-open at a later stage.