I want to do something repeatedly on a list of files. The files in question have spaces in their names:
david@david: ls -l
total 32
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha
-rw-rw-r-- 1 david david 0 Mai 8 11:55 haha~
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha (3rd copy)
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha (4th copy)
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha (5th copy)
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha (6th copy)
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha (7th copy)
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha (another copy)
-rw-rw-r-- 1 david david 13 Mai 8 11:55 haha (copy)
Now I want to stat each of these files:
david@david: echo '
for file in $(ls)
do
stat $file
done' | bash
(I use echo and a pipe in order to write multi-line commands.)
When I do that, it works correctly on those files that do not have any spaces in their names. But the others...
stat: cannot stat ‘(another’: No such file or directory
stat: cannot stat ‘copy)’: No such file or directory
Changing $(ls) to "$(ls)", or $file to "$file", does not work. What can I do?
Edit:
echo '
for files in *
do
stat "$files"
done' | bash
does the trick! As I'm new to bash, I want to keep things as simple as possible - so nothing with trying to escape spaces, using xargs, or the solution with read -r, although they solve the problem.
As some have asked: yes, using this instead of stat * is weird. But I just wanted to find a general way to apply the same command to a bunch of file names in bash, using a for loop. So stat could stand for gzip, gpg or rm.
The extra layer of quoting from the echo ' ... ' | bash construct is complicating things. You can just use:
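for file in *          # let the shell glob expand to the actual file names
do
    stat "$file"       # quote, so names with spaces stay in one argument
done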
But also, for example:
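stat -- *              # no loop needed; the glob is handed straight to stat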
...and if you want to collect the files first and then apply the command (why?), you can go with something like the following (but be careful with files containing newlines... (1)):
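# collect the names into an array, one per line (mapfile needs bash 4+),
# then run the command once on the whole set
mapfile -t files < <(printf '%s\n' *)
stat -- "${files[@]}"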
...and if you want hidden files too, just use * .* as the pattern, but then remember that . and .. will be in the set, for example:
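for file in * .*
do
    # the .* pattern also matches . and .. - skip those
    [ "$file" = . ] && continue
    [ "$file" = .. ] && continue
    stat "$file"
done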
As an aside, you shouldn't parse ls output.

(1) but if you have file names with newlines, you somewhat deserve it... ;-)
On a side note: you can split long / complicated commands over multiple lines by typing a space followed by a backslash and hitting Enter every time you want to continue on a new line, instead of forking multiple processes with echo [...] | bash; also, you should enclose $file in double quotes, to prevent stat from breaking on filenames containing spaces, for example:
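# same loop, quoted and split over lines with trailing backslashes
# (still uses $(ls); its remaining problem is discussed below)
for file in $(ls) ; \
do \
    stat "$file" ; \
done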
The problem is that $(ls) expands to a list of filenames containing spaces, and the same will happen also with "$(ls)". Even solving this problem, this method will still break on filenames containing backslashes and on filenames containing newlines (as pointed out by terdon).
A solution for both problems would be to pipe the output of find to a while loop running read -r, so that, at each iteration, read -r stores one line of find's output into $file:
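# -maxdepth 1 -type f limits this to regular files in the current
# directory (adjust as needed); read -r keeps backslashes literal
find . -maxdepth 1 -type f | while read -r file
do
    stat "$file"
done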
Use the good old find; it works with hidden files, newlines and spaces. Any other command can stand in for stat, for example:
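# -exec hands the names straight to the command, so spaces, newlines
# and hidden files are all handled; -maxdepth 1 keeps it to the current
# directory (swap stat for gzip, gpg, rm, ...)
find . -maxdepth 1 -type f -exec stat {} +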
As an R guy, I already found a workaround in R. I know, it's mad. I wish the output of ls were easier to parse... R can deal with the spaces because dir() returns a quoted character value; anything between the quotes is then a valid file name, spaces included.

I have run into other instances of whitespace issues in for loops, so the following (imo more robust) command is what I generally use. It also fits nicely into pipes:
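# reads one name per line, so spaces survive; works inside a pipeline
ls | while read -r file
do
    stat "$file"
done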
You could combine this with grep, or use find instead, for example:
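# filter the names with grep first ('copy' is just an example pattern) ...
ls | grep 'copy' | while read -r file
do
    stat "$file"
done

# ... or skip ls and let find do the selecting
find . -maxdepth 1 -name '*copy*' | while read -r file
do
    stat "$file"
done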
Instead, you can rename your files, replacing the spaces with some other character such as an underscore, so you get rid of this problem.
To do that easily, run a command along these lines:
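# rename every file whose name contains a space,
# replacing the spaces with underscores
for f in *' '*
do
    mv -- "$f" "${f// /_}"
done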
This answer will solve the problem of parsing ls and take care of the backslashes and newlines. Try this; it will solve your problem using the Internal Field Separator (IFS):
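# make the shell split the $(ls) output on newlines only,
# not on every space
IFS=$'\n'
for file in $(ls)
do
    stat "$file"
done
unset IFS              # restore the default splitting behaviour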
But you can also easily solve it without needing to parse the ls output at all, using a glob, for example:
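for file in ./*        # ./ also guards against names that start with a dash
do
    stat "$file"
done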