Closed joseaveldanes closed 2 years ago
This repo is in progress - I assumed you meant to make this on Python-Text-Analysis-Fundamentals. I'm going to make this private in the meantime to avoid confusion.
I'm confused about all the different issues related to this?!
But in any case, the generic campus DataHub at https://datahub.berkeley.edu/ is limited to 1GB. However, we have a D-Lab-specific JupyterHub at https://dlab.datahub.berkeley.edu/ that has 2GB available (and it is potentially possible to increase that limit), so consider using that instead.
Let me know if you have more questions about it.
Sorry folks,
I accidentally made a second issue in the same erroneous repository, and then resolved it immediately because I didn't know how to delete it.
Good to know! Thank you.
All my best, JMA
On Thu, Feb 17, 2022 at 3:52 PM Aaron Culich @.***> wrote:
[quoted text of the comment above]
--
Jose Martin Aveldanes
Ph.D. Student in Sociology, University of California, Berkeley
Barrows Hall, Berkeley, CA 94720
After using up 1 GB of memory on JupyterHub, the kernel fails. We are not sure how to remove objects that currently exist in the Jupyter Notebook so that we don't use up all of the memory. Users were having issues with this as well.
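One way to free memory inside a running notebook is to delete the names bound to large objects and then trigger garbage collection. This is a minimal sketch (the variable name `big_list` is just an illustration, not from the original materials):

```python
import gc
import sys

# A large object that eats into the hub's memory limit.
big_list = list(range(1_000_000))
print(sys.getsizeof(big_list), "bytes for the list object")

# Unbind the name and ask the garbage collector to reclaim
# the memory so the kernel stays under the hub's 1-2GB cap.
del big_list
gc.collect()

# The name no longer exists; referencing it now raises NameError.
# In a notebook, the magic command `%reset -f` clears *all*
# user-defined names at once, and Kernel -> Restart releases
# everything by starting a fresh process.
```

Note that `del` only removes the name binding; memory is actually reclaimed once no other references to the object remain, which is why a follow-up `gc.collect()` can help for objects involved in reference cycles.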