If you're working with a lot of Docker containers, then you're probably spending a lot of disk space on them too. A single container image that includes TensorFlow can easily take up over 1 GB, so it pays to remain mindful. If you want to get an impression of how much disk space Docker is using on your machine, just run:
docker system df
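If you want more detail than the summary totals, the same command accepts a verbose flag (to the best of my knowledge, -v breaks the usage down per image, container, and volume):
docker system df -v # Show disk usage per image, container, and volume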
If you're running Python containers with machine learning tools, prepare for many gigabytes of disk space being used. That's why, every once in a while, I just straight up prune my entire system with this command:
docker system prune -a
Note that this command also removes all cached layers. It's a hard reset, which means the next time you build your containers the first build will take a bit longer again. If you'd rather avoid that, you can be more selective by using the other available prune commands.
docker container prune # Remove all stopped containers
docker volume prune # Remove all unused volumes
docker image prune # Remove unused images
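You can also combine pruning with a time filter so that recent work survives the cleanup. As a rough sketch (assuming your Docker version supports the until filter on these commands), this only removes things older than a week:
docker image prune -a --filter "until=168h" # Only prune images created more than a week ago
docker builder prune --filter "until=168h" # Only prune build cache older than a week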