
Bash – find directories of certain sizes


I often find myself trying to figure out which directories are consuming most of my persistent memory (I would say hard disk, but it isn’t a disk – and probably not even hard – anymore), and I always get disappointed with myself for not being able to remember how to do it in the terminal :) . If you are dealing with files, you can use the almighty find command with the size filter, for example:

find . -type f -size +100M

But that won’t work with directories. After all, I guess they don’t take much space themselves – the files they contain are the ones that do :-) . So what I usually end up doing is finding directories down to a certain depth, then calculating their sizes and filtering with grep. This would be an example:

find $HOME -mindepth 1 -maxdepth 3 -type d -exec du -hs {} \; | grep -E "[0-9]{1}(\.[0-9]{1,2})?G" | sort -n

This would list the directories contained in the user’s home, recursing until a maximum depth of 3 levels, and then calculate all their sizes. After that, it filters for directories whose size is 1 gigabyte or more. Finally, the sizes are sorted with a numeric criterion. You can change the last letter to filter by megabytes (M), and the count in the first part of the regular expression so it only matches numbers of a given length. For example, to get sizes in the range [100M-1G), you would use the following regular expression: "[0-9]{3}(\.[0-9]{1,2})?M". And to find directories with a size of at least 10 gigabytes, you would use: "[0-9]{2}(\.[0-9]{1,2})?G". You can also change the mindepth and maxdepth params to control how deep the recursion goes. And if you don’t provide a maxdepth param and set the base directory to /, the whole filesystem will be analyzed :).
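For reference, here are those variants spelled out as ready-to-run commands (a sketch assuming GNU find and du; adjust the base directory and depths to your own setup):

# Directories between 100 MB and 1 GB, up to 3 levels below the home directory
find $HOME -mindepth 1 -maxdepth 3 -type d -exec du -hs {} \; | grep -E "[0-9]{3}(\.[0-9]{1,2})?M" | sort -n

# Directories of roughly 10 gigabytes or more
find $HOME -mindepth 1 -maxdepth 3 -type d -exec du -hs {} \; | grep -E "[0-9]{2}(\.[0-9]{1,2})?G" | sort -n

# Whole filesystem, no depth limit (this can take a long while)
find / -mindepth 1 -type d -exec du -hs {} \; | grep -E "[0-9]{1}(\.[0-9]{1,2})?G" | sort -n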
