Open smic-datalabs-von opened 4 years ago
Can you use the mechanism described in https://github.com/jupyter-incubator/sparkmagic/#conf-overrides-in-code to override the config in your code?
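For reference, the override mechanism linked there is the `%%configure` cell magic of the sparkmagic wrapper kernels, which passes extra fields through to Livy's session-creation request. A sketch (the `name` value here is illustrative, not a sparkmagic default):

```
%%configure -f
{"name": "jerry-notebook-a"}
```
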
Yes but that would defeat the purpose of automatic assignment of session names. Most of our users would often forget this. Moreover, having sparkmagic-specific preamble code for every notebook is not something we are keen on having.
Is this still relevant?
To keep backward compatibility, maybe it could rather be a new property, e.g.:

```json
"session_configs": {
  "name_prefix": "jerry"
}
```
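To make the proposal concrete, the kernel could derive a unique session name by appending the Livy session id to the configured prefix. A hypothetical sketch (this is not sparkmagic code; `session_name` and its signature are invented for illustration):

```python
def session_name(config: dict, session_id: int) -> str:
    """Build a unique Livy session name from an optional prefix.

    Keeps today's behaviour when a fixed 'name' is configured; otherwise
    combines the proposed 'name_prefix' with the Livy session id.
    """
    session_configs = config.get("session_configs", {})
    if "name" in session_configs:
        # Current behaviour: a fixed name, which collides when the same
        # user opens a second kernel.
        return session_configs["name"]
    prefix = session_configs.get("name_prefix", "session")
    return f"{prefix}-{session_id}"  # e.g. "jerry-1", "jerry-2"


print(session_name({"session_configs": {"name_prefix": "jerry"}}, 1))  # jerry-1
print(session_name({"session_configs": {"name_prefix": "jerry"}}, 2))  # jerry-2
```

With this, concurrent kernels for the same user would get distinct names while the prefix still identifies the user in the Spark Master UI.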
Still very relevant: we have the exact same issue, wanting to identify the user while still needing to allow multiple concurrent kernels for the same user.
I like juhoautio's parameter suggestion
Wondering if anyone has found a solution to this issue?
Not sure if this project is still active or went private. I am looking for alternatives at this point. Any suggestions?
My team connects to a Dockerized JupyterHub that spawns Jupyterlab containers with sparkmagic installed. Thing is, we are having problems knowing which user is using which session when viewing the Spark Master UI.
We thought that placing a modified `config.json` file, with the `name` key altered under the `session_configs` key, would take care of this (for example, setting `name` to `jerry`). However, we encountered another problem: if the same user opens a new PySpark kernel, it raises an error saying that there is already a session with the same `name`, which is `jerry` in this case.

What we want is for each session opened by the user with the `jerry` config to have a name like `jerry-<session-id>`. So, for example, each session opened by `jerry` should be `jerry-1`, `jerry-2`, and so on. We use the Spark Master UI as our main monitoring view for sessions opened in our Spark cluster.
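The kind of `config.json` change described above would look roughly like this (a sketch; only the `name` key under `session_configs` is the relevant part):

```json
{
  "session_configs": {
    "name": "jerry"
  }
}
```
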
Is my request possible?