Today my dev server ran out of space, which was a little bit weird - 20 GB should be enough for a bunch of websites. I managed to free up about 6 GB of space just by doing the following:
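To get the overall picture (and to check progress after each step), plain df is enough; nothing here is specific to my setup:

df -h # show disk usage per filesystem, human-readable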
1. Aptitude cache
sudo apt-get autoremove
I had to run this two or three times until nothing could be removed anymore. Not sure why it didn't clean everything up on the first pass.
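It's also worth checking the downloaded package cache under /var/cache/apt/archives, since autoremove doesn't touch it. Nothing fancy, just the standard apt commands:

du -hs /var/cache/apt/archives # see how big the package cache is
sudo apt-get clean # delete all cached .deb files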
2. Logs
In my case journal logs (/var/log/journal) used more than 1 GB of space. I found a cool thread on the Arch forum: Is it safe to delete /var/log/journal log files? -- with two useful commands:
journalctl --disk-usage # See how much space your journal logs use
systemctl kill --kill-who=main --signal=SIGUSR2 systemd-journald.service # Force the journal to run its cleanup procedure
In my case /etc/systemd/journald.conf didn't have any values set (everything was commented out), so I had to un-comment the defaults and change the following:
SystemMaxUse=16M
ForwardToSyslog=no
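Both of these live under the [Journal] section of journald.conf, so the relevant bit ends up looking roughly like this:

[Journal]
SystemMaxUse=16M
ForwardToSyslog=no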
After that I tried the second command, which didn't help. Since this is a dev server, a few minutes of downtime are OK, so I restarted, and the journal logs are down to ~135M now (not sure why not 100M, but I'm happy anyway).
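Worth noting: newer systemd versions let journalctl trim the journal directly, which should give the same result without a restart (I haven't checked exactly which version introduced it):

sudo journalctl --vacuum-size=100M # delete archived journal files until the total drops below 100M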
3. Find biggest folders
Cleaning up unused packages and logs didn't free up as much space as I wanted. I wanted to check which folders had the biggest disk usage, but I had no idea how to do it. After some searching I ended up on How to Find Out Top Directories and Files (Disk Space) in Linux, which is exactly what I needed. I used the following command to find the 20 biggest folders in the current directory. I started from the root (/) and ended up in /var/lib.
du -hs * | sort -rh | head -n 20
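If you start right at / you'll probably want sudo so du can read everything, and sending errors to /dev/null keeps the noise from /proc and friends out of the way; something along these lines:

sudo du -hs /* 2>/dev/null | sort -rh | head -n 20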
Using this I was able to identify other folders that had grown out of proportion (rough cleanup commands after the list):
- drush backup folder
- cache_form table
- docker images (https://forums.docker.com/t/how-to-delete-cache/5753/2)
- mysql slow query log
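For completeness, here's a rough sketch of how those can be reclaimed. The paths and commands below assume a fairly standard setup (a Drupal site managed with drush, Docker 1.13 or newer), so treat them as a starting point rather than exactly what I ran:

rm -rf ~/drush-backups/* # drush's default backup location; adjust to wherever your backups live
drush sqlq "TRUNCATE TABLE cache_form" # Drupal rebuilds this cache table as needed
docker image prune # remove dangling images (see the forum thread above for older Docker versions)
# For the MySQL slow query log, truncate the file configured in my.cnf or disable slow_query_log there.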