aculich opened this issue 3 years ago
Also, @ryanlovett pointed me to this page: https://docs.datahub.berkeley.edu/en/latest/admins/howto/course-config.html
Maybe I could configure these as profiles?
@aculich sorry about the late response. Are you still interested in making profiles happen? If so we can try it out by adding config to https://github.com/berkeley-dsep-infra/datahub/blob/staging/deployments/dlab/config/common.yaml
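For reference, a profile-based setup might look something like the sketch below, assuming the dlab deployment uses KubeSpawner's `profileList` mechanism from the zero-to-jupyterhub chart. The profile names and memory values here are illustrative placeholders, not the actual deployment's configuration:

```yaml
# Hypothetical sketch for deployments/dlab/config/common.yaml --
# profile names and resource values are placeholders, not real settings.
jupyterhub:
  singleuser:
    profileList:
      - display_name: "Standard (1 GB RAM)"
        description: "Default environment for most users"
        default: true
      - display_name: "Workshop (4 GB RAM)"
        description: "Larger memory for in-memory dataset work"
        kubespawner_override:
          mem_guarantee: 2G
          mem_limit: 4G
```

With something like this in place, users would pick a profile from a dropdown on the spawn page rather than needing per-user overrides.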
Yes, I'd still like to try it out if possible. Let me know what the next steps would be.
@yuvipanda Just bringing this issue back for discussion! Is this something we could scope for this sprint?
I would love to see that if possible! Thanks for bringing this back on the radar, Balaji!
Which hub do you want more RAM on?
dlab.datahub.berkeley.edu
Which class is this request for?
Since dlab has informal workshops and projects rather than formal courses, it is not listed in the regular course catalog.
Let me know if there is some other mechanism we should put in place to make it easy to automate, such as using a Google Group (@lists.berkeley.edu) or using CalGroups.
I'm happy to help contribute code to extend JupyterHubs to be able to do this.
In the meantime, should I give a list of email addresses of the people who need to have a RAM increase? Or submit a pull request?
How many students do you expect in this class?
This is for two different workshop projects, each with 5-10 people.
How much RAM does this class need?
4 GB of RAM per student; however, some (but not all) may need up to 8 GB.
Why does this class need this much RAM?
Working with larger datasets in-memory.
Any additional information we should know about?
The amount of RAM needed is occasionally large, not sustained usage. If it is possible to do node-pinning so that the students in each group all share a single node, that might be a good way to go, as they can fight amongst themselves for the resources on that node.
Again, I'm happy to help contribute code under the hood to make this happen, and open to suggestions for other better ways to implement this.
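One way the node-pinning idea could be expressed, assuming a dedicated node pool existed for these workshops, is via KubeSpawner's `node_selector` in a profile override. The pool label below is hypothetical, and the low guarantee / high limit pairing reflects the "occasional spikes, shared headroom" usage described above:

```yaml
# Hypothetical sketch: pin a profile's pods to a labeled node pool.
# The "hub.jupyter.org/pool: dlab-workshop" label is illustrative only.
jupyterhub:
  singleuser:
    profileList:
      - display_name: "Workshop (shared node)"
        kubespawner_override:
          node_selector:
            hub.jupyter.org/pool: dlab-workshop
          mem_guarantee: 1G   # low guarantee so pods pack onto one node
          mem_limit: 8G       # high ceiling for occasional large jobs
```

The trade-off in this sketch is that a low guarantee lets the scheduler co-locate all workshop users, while the high limit allows any one of them to burst, with the users themselves contending for the node's actual memory.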