OrielResearchCure opened this issue 4 years ago
I made a wrong decision and deleted `/tmp/*` using `rm`, and now I am unable to connect to the machine with `datalab connect`. I tried creating a new Datalab machine with the same disk, but that didn't help. Below is the `df -h` output when looking at it from inside the Docker container.
```
docker exec -it 4fbf6bcd1591 df -h
Filesystem      Size  Used  Avail  Use%  Mounted on
overlay          16G  5.2G    11G   34%  /
tmpfs            64M     0    64M    0%  /dev
tmpfs           7.4G     0   7.4G    0%  /sys/fs/cgroup
shm              64M     0    64M    0%  /dev/shm
/dev/sda1        16G  5.2G    11G   34%  /var/log
overlayfs       7.4G   88K   7.4G    1%  /etc/fluent/config.d
tmpfs           7.4G     0   7.4G    0%  /proc/acpi
tmpfs           7.4G     0   7.4G    0%  /proc/scsi
tmpfs           7.4G     0   7.4G    0%  /sys/firmware
```
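For reference, a rough diagnostic sketch (assuming `4fbf6bcd1591` is the Datalab container and the connect failure is caused by the wiped `/tmp`): check the container logs for the actual error, recreate `/tmp` with its standard permissions, and restart the container.

```bash
# Sketch: look for the startup/connect error, then restore /tmp and restart.
docker logs --tail 100 4fbf6bcd1591       # why is the container unhappy?
docker exec 4fbf6bcd1591 mkdir -p /tmp    # recreate the deleted directory
docker exec 4fbf6bcd1591 chmod 1777 /tmp  # 1777 is the standard /tmp mode
docker restart 4fbf6bcd1591
```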
I am trying to copy the notebooks out of the Docker container. I am able to see the files under the `/mnt` folder; the shell even auto-completes the folder and file names, but the command still fails with "No such file or directory":
```
docker exec -it 4fbf6bcd1591 ls /mnt/disks/datalab-pd/content/datalab/Eila/
ls: cannot access '/mnt/disks/datalab-pd/content/datalab/Eila/': No such file or directory
```
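One possible explanation (a guess, not confirmed here): the path auto-completes because it exists on the host VM, while inside the container the disk is mounted at a different path, so `docker exec ... ls` on the host path fails. A sketch for checking the container's mounts and backing the notebooks up straight from the host path (the bucket name is just a placeholder):

```bash
# Where is the persistent disk actually mounted inside the container?
docker inspect -f '{{ json .Mounts }}' 4fbf6bcd1591

# The files already exist on the host, so copy them from there,
# e.g. to a GCS bucket (gs://my-backup-bucket is a placeholder):
gsutil -m cp -r /mnt/disks/datalab-pd/content/datalab/Eila gs://my-backup-bucket/datalab-backup/
```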
I need help resolving this ASAP :-( Thanks, eilalan
Reading more about the tools I am using (tools from NCBI), I understand that they are filling up the cache: https://standage.github.io/that-darn-cache-configuring-the-sra-toolkit.html. I found more posts about that. I don't know whether this helps with recovering the notebooks from the machine. Thanks a lot for any advice, Eila
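The linked post boils down to relocating or disabling the SRA Toolkit cache so it stops filling the small boot disk. A hedged sketch (flag names should be double-checked against `vdb-config --help`; the cache path is only an example):

```bash
# Point the SRA Toolkit cache at the larger persistent disk...
vdb-config --set /repository/user/main/public/root=/mnt/disks/datalab-pd/sra-cache
# ...or disable caching entirely.
vdb-config --set /repository/user/cache-disabled=true
```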
Hello,
I am installing a few Python libraries to update apache-beam. Usually it works fine, but today I received the following space error. It seems that /dev/sdb is out of space. How do I clean it up?
Thanks, eilalan
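A rough cleanup sketch, assuming `/dev/sdb` is the Datalab persistent disk mounted at `/mnt/disks/datalab-pd` on the host (the paths are guesses): find what is using the space, then clear caches that can safely be regenerated.

```bash
df -h | grep sdb                                     # confirm where /dev/sdb is mounted
sudo du -xh --max-depth=2 /mnt/disks/datalab-pd | sort -h | tail -n 20
pip cache purge                                      # needs pip >= 20.1
find /mnt/disks/datalab-pd -name '*.sra*' -size +1G  # stray SRA downloads/caches
```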