Closed: w-markus closed this issue 3 years ago.
Thanks for the report! Does this happen when running a "plain" Python script, or from some kind of REPL (ipython or similar)?
When using a plain Python script, you need to guard the Context creation, like this:

```python
import libertem.api as lt

if __name__ == "__main__":
    with lt.Context() as ctx:
        ds = ctx.load("...")  # etc.
```

I suspect this is what's happening here. See also the basic example in the LiberTEM docs.
Yes, correctly guessed, I was running a "pure" Python script from within VS Code. And yes, the suggested guard did the trick; both lines are necessary, the `if __name__ ...` as well as the `with ...`.
Thanks a lot!
Perhaps we leave this issue open until I have been able to run the full example?
Sure, sounds good! Maybe we should include a pointer to the documentation in our notebooks, too.
oh yes, that's something for the documentation. :-)
Do these two lines cause problems when included in a notebook run script?
Yes, because you can't really have a `with` statement that wraps all the notebook cells. The `with` statement could be replaced with a `ctx.close()` at the end, but that is also inconvenient for users that just "run all cells" and want to keep experimenting in the notebook afterwards. Wrapping all cells into an `if __name__ == "__main__"` has similar problems. I think it would be best to include a non-executing code snippet in a markdown cell, like the one in my comment above.
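For reference, a minimal sketch of what that notebook-style usage looks like, assuming the same `lt.Context` API as in the script example above; the explicit `close()` is optional and only needed once you are done experimenting:

```python
import libertem.api as lt

# First cell: create the Context once; it stays alive across cells.
ctx = lt.Context()

# Later cells: load data and run analyses, re-running cells freely.
ds = ctx.load("...")  # etc.

# Optional final cell: release the cluster resources explicitly.
# Leaving this out keeps the Context available for further experiments.
ctx.close()
```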
oh, yes of course! This reminds me of (off topic): https://docs.google.com/presentation/d/1n2RlMdmv1p25Xy5thJUhkKGvjtV-dkAIsUXP-AL4ffI/edit#slide=id.g362da58057_0_1 ;-)
@w-markus yes, using notebooks requires some discipline. The slides show some nice "don'ts". Before sharing a notebook I always restart the kernel, run all cells, save, and shut down. That pretty much avoids the problem.
In particular for large data analysis they are pretty great; they are my preferred prototyping method. Importing everything, starting up a cluster, warming up the workers etc. takes its time. The full notebook also often has a few "number crunching" steps, for example first a sum analysis, then COM analysis, then trotter generation, then ptychography. If I want to quickly benchmark some code changes in the ptycho routine, it is just great to run
```python
%autoreload
udf = SSB_UDF(...)
%time res = ctx.run_udf(...)
```
or change a bit of code in the UDF definition in a cell above and just run it, without going through the entire code that leads up to it. The same goes for a quick `%lprun ...` to see where that code spends its time etc. And we get our examples with figures and all embedded in our documentation, and they are at the same time runnable!
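As an aside, the magics in that snippet need a one-time setup in the notebook, roughly like the sketch below. It assumes the standard IPython `autoreload` extension and the `line_profiler` package for `%lprun`; the dataset path is a placeholder, as in the earlier snippets.

```python
# One-time setup cell (sketch); the %autoreload / %time cell from the
# comment above can then be re-run on its own after each code change.
%load_ext autoreload
%autoreload 2            # re-import changed modules before each cell runs
%load_ext line_profiler  # provides the %lprun magic

import libertem.api as lt

ctx = lt.Context()    # start the cluster / warm up the workers once
ds = ctx.load("...")  # placeholder, as in the script example above
```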
Problem solved!
When starting the SSB example, upon creating the context, I get numerous copies of:

The software runs on an HPE ProLiant DL385 Gen10, 2x EPYC 7F72, 512 GB RAM, under Debian Linux (testing distribution).