I have a directory containing a number of subdirectories. Each of these subdirectories contains a subdirectory, and those all have the same name. I would like to produce a list of all the files inside those identically-named subdirectories on the command line. So, for example, if I have:
    dir1/
        file1.txt
        subdir/
            relevant_file1.c
            relevant_file2.c
    dir2/
        file2.txt
        subdir/
            relevant_file3.txt
            relevant_file4.java
    dir3/
        subdir/
            relevant_file5.cpp
        irrelevant_subdir/
            unimportant_file.txt
    dir4/
        subdir/
I would like to have the following output:
    dir1/subdir/relevant_file1.c
    dir1/subdir/relevant_file2.c
    dir2/subdir/relevant_file3.txt
    dir2/subdir/relevant_file4.java
    dir3/subdir/relevant_file5.cpp
I presume this should not be too difficult using find, but I have not quite been able to figure it out. It's difficult to search for this problem because it is so specific, and just searching for "find file by matching on its path" doesn't produce anything useful.
Try this:
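A minimal sketch (assuming GNU or POSIX find; -path matches the glob against the whole pathname, and the leading ./ in the output comes from the . starting point):

    find . -path '*/subdir/*' -type f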
You don't actually need find for this. A simple ls will do; or, if you want to do something with those files and not just list them, use shell globs. Both are sketched below:
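For example (a sketch assuming a POSIX shell; -d makes ls print a matched directory itself rather than its contents, and echo is just a placeholder action):

    ls -d -- */subdir/*

    for file in */subdir/*; do
        echo "$file"    # placeholder: replace with whatever you want to do
    done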
Of course, that will list anything in any subdirectory called subdir, including directories, symlinks, device files or anything else. If you want to limit it to regular files, use find */subdir -type f as suggested by @Carl, or add a file test:
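A sketch of that test in the glob loop (-f is true for regular files, and for symlinks that point to regular files):

    for file in */subdir/*; do
        [ -f "$file" ] && echo "$file"
    done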
A better approach with find is to use the -path option. It's more useful when you have a more complex structure, something like this:
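For instance, a hypothetical layout like this (the names are illustrative), where the second subdir sits one level deeper than the glob can reach:

    dir1/
        subdir/
            relevant_file1.c
    dir2/
        sub/
            subdir/
                relevant_file2.c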
In the above example, something like find */subdir -type f only finds files under the first-level subdir (here dir1/subdir), because the shell glob */subdir never expands to the deeper dir2/sub/subdir. However, find -path "*/subdir/*" -type f works fine and finds both (GNU find assumes . as the starting point when none is given).

Just use the appropriate tool:
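For example (a sketch assuming an mlocate/plocate-style locate, where a pattern containing globbing characters is matched against the whole stored path):

    locate '*/subdir/*'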
You may want to (re-)run updatedb, as shown below, if you need to refresh locate's database (for example, if you want to find recently added files), but it should be rebuilt via cron every day anyway (if the PC is running at that time).
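Refreshing the database usually requires root:

    sudo updatedb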
If you do need that initial updatedb run, it may seem slower than the targeted find, but in the end it's much faster overall, since every subsequent query returns almost instantly.