The problem is not with `$file`; it's with your usage of `$(find …)`.
Both regular variables and `$(…)` command substitutions are subject to exactly the same kind of word-splitting. If you put `$(find …)` in double quotes, you get a single word containing the entire output. If you don't, it will be split at any whitespace – not just newlines or some other magical boundary you might have expected.
An important detail is that backslash escapes are not processed when the expansion is split – the output is simply split at whitespace, nothing more. (In other words, quoting `$file` didn't help, because it never held the correct value in the first place!)
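A minimal demonstration of that splitting (the scratch directory from `mktemp` is arbitrary – any path would do): one file whose name contains a space becomes two loop iterations when the command substitution is left unquoted.

```shell
# Create a scratch directory with a single file whose name has a space.
dir=$(mktemp -d)
touch "$dir/a b.txt"

count=0
for f in $(find "$dir" -type f); do
  count=$((count + 1))    # runs once per *word*, not once per file
done
echo "$count"   # 2 – the one file name was split at the space
```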
If you want to process all files recursively, your options are:
Read the output of `find` in another way:
find … -print | while IFS= read -r file; do
echo "Got $file"
done
Side note: file names may technically contain newlines as well, so you might want to guard against that, rare as it is:
find … -print0 | while IFS= read -r -d '' file; do
echo "Got $file"
done
For small numbers of files, use bash's extended wildcards:
shopt -s globstar
for file in /path/to/foo/**; do
echo "Got $file"
done
With large directory trees, the advantage of `find | while read` is that it streams – your script processes results as they arrive. Both `$(find)` and wildcards, on the other hand, have to collect everything into memory first, and only then hand over the (possibly massive) result.
On the other hand, the pipe affects the entire `while` loop (not just the `read` command), so if you run anything inside it that needs to read keyboard input, you have to manually point it back at the original stdin, e.g. `vim "$file" < /dev/tty`.
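To see that stdin problem concretely, here is a sketch where an inner `read` (standing in for an interactive program) silently steals entries from the outer loop's file list, because both reads share the pipe:

```shell
# Three fake file names stand in for find's output.
out=$(printf 'file1\nfile2\nfile3\n' | while IFS= read -r file; do
  IFS= read -r stolen   # hypothetical inner prompt; consumes the next list entry
  echo "Got $file (inner read consumed: $stolen)"
done)
echo "$out"
# Got file1 (inner read consumed: file2)
# Got file3 (inner read consumed: )
```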
Comments:

- "Here's a rule for Linux Life: If you have trouble doing it in Bash, stop doing it in Bash and use something else.... Bash is a terrible language. Use Perl or something." – Ben (2017-08-24)
- Also see: https://superuser.com/a/117600/334516 – muru (2017-08-24)
- Also see: Why is looping over find's output bad practice? – G-Man Says 'Reinstate Monica' (2017-09-12)