The idiomatic way to define constants in one place is to have a file like `const.jl` where all constants are defined with the `const` keyword (important for performance), e.g. `const x = 1.0`.
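A fuller sketch of such a file (names and values here are purely illustrative):

```julia
# const.jl -- all tunable constants collected in one place.
# `const` fixes the type of these globals, so hot loops that
# read them can be compiled to fast, type-stable code.
const A  = 0.5     # diffusion constant
const DX = 0.01    # grid spacing in x
const DY = 0.01    # grid spacing in y
const INITIAL_TEMP = 65.0
```

Scripts that need the values can then simply `include("const.jl")` and use the names directly.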
There will most likely be a module to load on LUMI; I'm working with a CSC person to have it up in the coming days.
I will be tweaking the example episode through next weekend, won't I 😄
OK, will do.
What do you think about Python/numba? Should we demo how to use an old version of numba that supports AMD GPUs? It doesn't feel quite right considering that they stopped supporting AMD GPUs, but at least it would be something for the pythonistas. I'm not sure how the future looks for AMD GPU programming in Python.
This is one of the consequences of us choosing LUMI: as I understand it, Python/numba works fine on Nvidia GPUs, and such code should be portable/transparent with respect to the architecture (because if even Python is not, then what are we doing here). So we are jumping through some hoops this run to make sure that the same lesson will be applicable for Nvidia participants/workshop hosts in the future.
Are you OK with keeping this as a PR, or should I merge and then revise?
We can keep the PR open until the necessary revisions are done.
Btw, this episode will now be taught on Friday, not Monday next week!
Regarding numba: how about spinning up a Google Colab notebook to demo the numba examples? Maybe better than not having any hands-on at all.
Yup, let's do that. I actually started developing the Python variant on Colab, but it did not agree with ArgParse, so I moved to a local setup later. (And of course the execution will be slow in comparison, but the gains should be visible in Colab-CPU vs. Colab-GPU at least.)
What would be the steps to run this on LUMI -- or anywhere? Both `julia example.jl` and `julia> include("example.jl")` complain about the missing `HeatEquation` package, which sounds reasonable, as none of the included files have the information about the `src/` directory.
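My guess is that the usual fix is to run everything inside the project's own environment, roughly like this (assuming the repository root has a Project.toml that names HeatEquation; a sketch, not tested here):

```julia
# In the Julia REPL, started from the repository root:
using Pkg
Pkg.activate(".")      # use the repo's own Project.toml
Pkg.instantiate()      # fetch the declared dependencies
using HeatEquation     # should now resolve
include("example.jl")
```

or, from the shell, `julia --project=. example.jl`.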
I'll look into this this evening and make sure we have a working example tomorrow!
Great! I got the Python version working well and am finishing it now, and I'm still clarifying the structure of that entire episode for tomorrow, so I have two small requests:
a) please commit the updated/tested Julia files (as well as instructions for LUMI, if they happen not to be straightforward) to `stencil/julia` when you do, and
b) have a read through the updated `13-examples.rst` in the existing draft pull request #76 and feel free to fix or comment on layout/callout box choices, to make them more consistent with the rest of the lesson.
@stepas-toliautas I've now simplified from a Julia module to three scripts roughly matching the Python scripts, and added LUMI step-by-step instructions. Currently it's only the threaded CPU version; I'll do the GPU one now, but I suggest merging right away.
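For reference, a threaded CPU stencil update in Julia looks roughly like this (a generic sketch, not necessarily the committed code; names and parameters are illustrative):

```julia
using Base.Threads

# One time step of the 2D heat equation, parallelized over
# columns with Julia's built-in threads.
# Run with e.g. `julia --threads=4 script.jl`.
function evolve!(curr, prev, a, dt, dx2, dy2)
    nx, ny = size(curr)
    @threads for j in 2:ny-1
        for i in 2:nx-1
            @inbounds curr[i, j] = prev[i, j] + a * dt * (
                (prev[i-1, j] - 2 * prev[i, j] + prev[i+1, j]) / dx2 +
                (prev[i, j-1] - 2 * prev[i, j] + prev[i, j+1]) / dy2)
        end
    end
end
```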
Thanks! Will there be a module to load for Julia on LUMI, or should we fetch the distribution and place it in the project's scratch folder? I'll tweak the output lines/default options to give identical results to the other variants. Also, what would be the idiomatic way to collect constants (initial temps etc.) in one place and use the names in the performant code? (As it's done in the C++ and Python versions.) And, finally: will we be able to test Python/numba on LUMI? (A venv with an older version of numba, probably?)