How to release or reset the GPU memory in Colaboratory notebook


Google has announced that it will provide GPU-enabled Colaboratory notebooks to anyone, for free. You can use a K80 GPU for up to 12 hours at a time, which may not be long enough to train a large neural network, but is plenty for training smaller models and for educational purposes.

However, it is easy to mess up the Colab environment and end up with your GPU memory full, unable to allocate any new tensors or computational graphs on the GPU.

I googled for a proper solution to this situation but couldn't find one. The following steps worked for me; follow them and you will have your GPU memory back.

Good luck!

Note: the "!" prefix lets you run terminal commands, such as nvidia-smi, ps, and top, from a notebook cell.
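For example, any shell command can be run this way. The snippet below (a small illustration, not from the original post) shows a few harmless ones; in a Colab cell each line gets the "!" prefix, but they are plain shell commands underneath:

```shell
# In a Colab cell, prefix each line with "!" (e.g. "!whoami").
whoami            # Colab notebooks run as root
ps aux | head -3  # first few rows of the process table
uptime            # how long the VM has been running
```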

  • GPU status when GPU memory is full
!/opt/bin/nvidia-smi

Wed Jun 20 08:04:20 2018
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 384.111                 Driver Version: 384.111                  |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  Tesla K80           Off  | 00000000:00:04.0 Off |                    0 |
| N/A   37C    P0    69W / 149W |  11439MiB / 11439MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
+-----------------------------------------------------------------------------+


  • Display all the Python-related processes
!ps -aux|grep python

root 92 0.1 0.4 189100 61912 ? Sl 01:21 0:34 /usr/bin/python2 /usr/local/bin/jupyter-notebook -y --no-browser --log-level=DEBUG --debug --NotebookApp.allow_origin="*" --NotebookApp.log_format="%(message)s" --NotebookApp.disable_check_xsrf=True --NotebookApp.token= --Session.key="" --Session.keyfile="" --ContentsManager.untitled_directory="Untitled Folder" --ContentsManager.untitled_file="Untitled File" --ContentsManager.untitled_notebook="Untitled Notebook" --NotebookNotary.algorithm="sha1" --KernelManager.autorestart=True --MultiKernelManager.default_kernel_name="python2" --ip="127.0.0.1" --port=9000 --port-retries=0 --notebook-dir="/content" --NotebookNotary.algorithm=sha256 --NotebookNotary.secret_file=/content/datalab/.config/notary_secret --NotebookApp.base_url=/tun/m/gpu-af18037e-50ae-419a-92e0-348f6d32a828/
root 100 0.3 12.7 41745048 1705896 ? Ssl 01:21 1:31 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-95cb65b9-23eb-4f87-801b-d995ca30fc32.json
root 25937 15.7 1.0 699440 141736 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-1121b7a1-4121-4745-b3ac-952f58c20fb9.json
root 25969 16.6 1.0 699444 142112 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-68d848cd-22f7-4fe0-858d-1ea1cde7ea2a.json
root 25986 16.8 1.0 699444 141628 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-efaf435d-03bd-4447-9f01-9afd67b270ba.json
root 26003 18.9 1.0 705776 142288 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-b9ae1e9f-9547-4070-a830-e748673b527e.json
root 26033 15.5 1.1 704176 151296 ? Ssl 08:06 0:02 /usr/bin/python -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-3987dd8b-4cea-49f5-ae72-9038b3b21108.json
root 26054 19.0 1.0 699696 142348 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-8fa2f852-e62a-4bc0-a8ca-39b91f5273ed.json
root 26337 20.2 1.0 699444 141428 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-25bf1ce9-abea-465d-a332-c4b54b32d7f4.json
root 26365 20.3 1.0 699444 141888 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-cc667cad-b6a9-4cca-8be9-2b4365cfff32.json
root 26404 21.8 1.0 699440 141716 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-0b81476d-89ff-4ebb-89e4-405637fec125.json
root 26426 22.5 1.0 699440 141572 ? Ssl 08:06 0:02 /usr/bin/python3 -m ipykernel_launcher -f /content/.local/share/jupyter/runtime/kernel-d6fae180-6195-46c2-b79a-1b6e51fbbaf9.json
root 26444 106 0.0 33960 5084 pts/0 Ss+ 08:06 0:01 /bin/sh -c ps -aux|grep python
root 26446 0.0 0.0 38200 5524 pts/0 S+ 08:06 0:00 grep python


  • Kill all the "ipykernel_launcher" processes (kill takes space-separated PIDs, no commas)
!kill -9 PID1 PID2 ...

!kill -9 25937 25969 25986 26003 26033 26054 26337 26365 26404 26426 26444 26446
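Instead of copying each PID by hand, the two steps above can be collapsed into one pipeline. This is a sketch, not from the original post: the "[i]" bracket trick stops grep from matching its own process (which is why the PID list above accidentally included the ps and grep processes), and "xargs -r" skips the kill entirely when nothing matches.

```shell
# Find every ipykernel_launcher process and kill it in one step.
# In a Colab cell, prefix the line with "!".
ps aux | grep '[i]pykernel_launcher' | awk '{print $2}' | xargs -r kill -9
```

Note that this also kills the kernel executing the cell itself, so the notebook will reconnect to a fresh kernel afterwards.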
  • GPU status after the GPU memory has been cleared
!/opt/bin/nvidia-smi

Wed Jun 20 08:06:53 2018
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 384.111                 Driver Version: 384.111                  |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  Tesla K80           Off  | 00000000:00:04.0 Off |                    0 |
| N/A   37C    P0    69W / 149W |    257MiB / 11439MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
+-----------------------------------------------------------------------------+
