I want to use find to find files in a set of folders restricted by wildcards, but where there are spaces in the path name.
From the command line, this is easy. The following examples all work:
find te*/my\ files/more -print
find te*/'my files'/more -print
find te*/my' 'files/more -print
These will find files in, for example, terminal/my files/more and tepid/my files/more.
However, I need this to be part of a script; what I need is something like this:
SEARCH='te*/my\ files/more'
find ${SEARCH} -print
Unfortunately, whatever I do, I don't seem to be able to mix wildcards and spaces in a find command within a script. The above example returns the following errors (note the unexpected doubling of the backslash):
find: ‘te*/my\\’: No such file or directory
find: ‘files/more’: No such file or directory
Trying to use quotes also fails.
SEARCH="te*/'my files'/more"
find ${SEARCH} -print
This returns the following errors, because the quotes are treated as literal characters rather than as shell quoting:
find: ‘te*/'my’: No such file or directory
find: ‘files'/more’: No such file or directory
Here's one more example.
SEARCH='te*/my files/more'
find ${SEARCH} -print
As expected:
find: ‘te*/my’: No such file or directory
find: ‘files/more’: No such file or directory
Every variation that I've tried returns an error.
I have a workaround, which is potentially dangerous because it returns too many folders. I convert all spaces to a question mark (single-character wildcard) like this:
SEARCH='te*/my files/more'
SEARCH=${SEARCH// /?} # Convert every space to a question mark.
find ${SEARCH} -print
This is the equivalent of:
find te*/my?files/more -print
This returns not only the correct folders but also terse/myxfiles/more, which it's not supposed to.
How can I achieve what I'm trying to do? Google has not helped me :(
The exact same command should work fine in a script:
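For instance, here is a self-contained sketch (the directory names terminal and tepid come from the question's example; creating them in a scratch directory first is my addition so the script runs anywhere):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"                       # scratch directory for the demo
mkdir -p terminal/'my files'/more tepid/'my files'/more

# Exactly the same quoting as on the command line works in a script:
find te*/'my files'/more -print
```

This prints both tepid/my files/more and terminal/my files/more.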
If you need to have it as a variable, it gets a bit more complex:
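One way (a sketch; the scratch-directory setup is my own) is to store the pattern with an escaped space and let eval re-parse the command line:

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"
mkdir -p terminal/'my files'/more

SEARCH='te*/my\ files/more'
# eval makes the shell parse the line a second time, so the wildcard
# expands and the backslash-escaped space is honoured:
eval find "$SEARCH" -print              # prints: terminal/my files/more
```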
WARNING: Using eval like that is not safe and can result in executing arbitrary and possibly harmful code if your file names can contain certain characters. See Bash FAQ 48 for details.
It's better to pass the path as an argument:
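A sketch of that idea (the function name search is hypothetical; the point is that the caller's glob expands before find ever runs, so each expanded path arrives as a separate, intact argument):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"
mkdir -p terminal/'my files'/more

# Hypothetical helper: quoting "$@" keeps paths with spaces intact.
search() {
    find "$@" -print
}

# The shell expands the glob here; each result is one argument.
search te*/'my files'/more              # prints: terminal/my files/more
```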
Another approach is to avoid find altogether and use bash's extended globbing features.
The globstar bash option lets you use ** to match recursively. To make it act 100% like find and include dotfiles (hidden files), also enable the dotglob option:
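Here is a minimal self-contained sketch (the directory and file names are my own; dotglob is the bash option that makes globs match hidden files):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"
mkdir -p terminal/'my files'/more
touch terminal/'my files'/more/file1 terminal/'my files'/more/.hidden

shopt -s globstar   # ** matches recursively, like find
shopt -s dotglob    # globs also match hidden (dot) files

for file in te*/'my files'/more/**; do
    printf '%s\n' "$file"
done
```

The loop prints the directory itself plus file1 and .hidden beneath it.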
You can even echo them directly without the loop.
How about arrays? An array assignment, (*), expands into an array of whatever matches the wildcard, and "${SEARCH[@]}" expands into all the elements in the array ([@]), with each individually quoted.
Belatedly, I realise find itself should be capable of this. Something like:
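A sketch of that realisation (the paths and file names are my own; -path matches the whole pathname against a pattern, and because the pattern is quoted, find rather than the shell does the matching, so the space needs no escaping):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"
mkdir -p terminal/'my files'/more
touch terminal/'my files'/more/file1

# find expands the wildcard itself; the shell never sees an unquoted space.
find . -path './te*/my files/more/*' -print   # prints: ./terminal/my files/more/file1
```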
It's a little dated now but, in case it helps anybody with this question: using the regular-expression collating symbol [[.space.]], without quoting the $SEARCH variable in the find command-line arguments, works as long as there are no special characters in the expanded path names where the asterisk matches. For example:
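A sketch of that technique (the directory setup is mine, and this assumes the glob engine understands POSIX collating symbols, as the answer claims):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"
mkdir -p terminal/'my files'/more

# [[.space.]] is a bracket expression containing the collating symbol
# for a space, so the pattern itself holds no literal space character.
SEARCH='te*/my[[.space.]]files/more'
find ${SEARCH} -print                   # ${SEARCH} deliberately unquoted
```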
To prevent globbing unwanted characters, one can replace the asterisk (*) with explicit collating elements (characters), giving the following results:
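For example (a sketch; terse comes from the question's false-positive example, and spelling out terminal character by character is my own illustration):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"
mkdir -p terminal/'my files'/more terse/'my files'/more

# Replace the asterisk with explicit collating elements so that only
# the intended folder name can match:
SEARCH='te[[.r.]][[.m.]][[.i.]][[.n.]][[.a.]][[.l.]]/my[[.space.]]files/more'
find ${SEARCH} -print                   # matches terminal/..., not terse/...
```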
Note that, to shorten the line, [.hyphen.] can be replaced by either [.-.] or -, and [.underscore.] by either [._.] or _. The backslashes (\) that truncate the long lines were added by me for readability.

I have finally found out the answer.
Add a backslash to all spaces:
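A sketch of that substitution (the same ${var// /...} parameter expansion the question already used for question marks, now inserting a backslash instead):

```shell
#!/bin/bash
SEARCH='te*/my files/more'
SEARCH=${SEARCH// /\\ }   # replace every space with backslash-space
echo "$SEARCH"            # → te*/my\ files/more
```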
At this point, SEARCH contains te*/my\ files/more. Then, use eval.
It's that simple! Using eval makes the shell re-parse the command line, so the wildcard and the escaped spaces in ${SEARCH} are interpreted instead of being passed literally to find.
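Putting the two steps together (a sketch; the scratch directory and its contents are my own):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"
mkdir -p terminal/'my files'/more

SEARCH='te*/my files/more'
SEARCH=${SEARCH// /\\ }    # SEARCH now holds te*/my\ files/more

# eval re-parses the line, so the wildcard expands and the escaped
# space is kept inside a single file-name argument:
eval find ${SEARCH} -print              # prints: terminal/my files/more
```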