Finding and clearing out large files

We have all been there: you are experiencing some funky behaviour in a *nix OS and, little do you know, you have filled up a file system because some log or process has spun out of control and is continually writing to disk.

Well, I did this again today without realising it was what was causing the issue. After painstakingly looking for other indicators, like runaway processes and host resource utilisation, it finally dawned on me to check the file system.

Below is the command I put together to find the 10 largest files and directories…

du -h -a /var | sort -h -r | head -n 10

I'll run you through the commands and options:

  1. du – executable to estimate file space usage
    • -h = human-readable format, i.e. convert from bytes to kilobytes/megabytes/gigabytes
    • -a = include all files, not just the directories
    • /var = the directory I used the command in; change this to wherever you are searching
  2. sort – sorts, duh!
    • -h = compare human-readable sizes (GNU sort), so du's -h output like 2K or 1G sorts correctly (a plain -n would compare only the leading number and put 680M above 1.5G)
    • -r = reverse, i.e. give me the biggest first
  3. head – take only the top lines of the output, in this case from stdout
    • -n = number of lines to output (the default is 10 anyway, but I thought I would include it for awareness)
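A quick way to see why the human-readable sort flag matters — the sizes and paths below are made up purely for illustration:

```shell
# Made-up du-style output: size<TAB>path
sizes='1.5G\t/var/log/big.log\n9.0K\t/var/cache/tiny\n680M\t/var/lib/db'

# -n compares only the leading number, so 680 beats 9.0 beats 1.5,
# and 680M wrongly lands on top of 1.5G.
echo "-- sort -n -r (numeric, wrong order):"
printf "$sizes\n" | sort -n -r

# -h understands the K/M/G suffixes, so 1.5G correctly comes first.
echo "-- sort -h -r (human-readable, correct order):"
printf "$sizes\n" | sort -h -r
```

Note that `-h` on sort is a GNU coreutils extension; on systems without it, drop the `-h` from du as well and sort the raw byte counts with `-n` instead.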

Once you have the list, you can choose which files to rm… I will not be supplying that command; you use rm at your own peril!
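One safer option worth knowing: if the culprit is a log file that a running process still has open, rm won't actually free the space until that process closes the file. Truncating it in place does reclaim the space immediately. A minimal sketch — the path here is a placeholder, substitute your own:

```shell
# Empty a runaway log in place instead of deleting it.
# /var/log/runaway.log is a hypothetical path for illustration.
truncate -s 0 /var/log/runaway.log

# Equivalent using plain shell redirection (no external command needed):
: > /var/log/runaway.log
```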

Note – the | is the pipe, which says take the output of this command and feed it into the next chained executable in the whole command.
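In other words, each stage reads the previous stage's stdout on its stdin — a tiny standalone illustration:

```shell
# sort reads printf's output, head reads sort's output.
printf 'b\na\nc\n' | sort | head -n 2
# prints:
# a
# b
```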