Bash function not escaping special characters

4

I have a function that I'm using to find stuff, but unfortunately any time I pass it a special character ($intVal or testing : etc.) it chokes. I was wondering what the fix is.

I can understand that using $ or % or : etc in grep without escaping causes this issue, but since I'm passing it in by reference I'm not sure how to escape it...

Anyway, here's the code.

function ffind()
{
     if [ $1 ] ; then
         find -type f | grep -ir '$1' * | grep -v '.svn'
     else
         echo "'$1' is not a valid resource"
     fi
}

Example(s):

$ ffind $intVal
'' is not a valid resource

$ ffind "testing :"
bash: [: testing: unary operator expected
'testing :' is not a valid resource

ehime

Posted 2012-08-31T19:14:27.050

Reputation: 241

Have you tried changing the single quotes to double quotes? "$1" instead of '$1'. Those are not the same in bash. – mnmnc – 2012-08-31T19:55:13.363

Answers

9

First example

$ ffind $intVal
'' is not a valid resource

This does not work because $foo is the syntax for variables, and you don't have a variable named intVal set, so $intVal expands to an empty string. Since the expansion is also unquoted, no arguments at all are passed to ffind.

To fix this, escape the $ – either \$intVal (backslash) or '$intVal' (single-quoted).

If you actually have a variable named intVal, though, put it in double quotes instead – "$intVal" – this will expand the variable's value, but will not split it.

Note that there is no such thing as "pass by reference" in bash. There is only pass-by-value and (tricky) pass-by-name.
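The difference between these quoting styles is easy to see with printf, which prints each argument it receives on its own line (a quick illustrative transcript; the intVal value is made up):

$ intVal="hello world"
$ printf '[%s]\n' $intVal      # unquoted: expanded, then split into two words
[hello]
[world]
$ printf '[%s]\n' "$intVal"    # double quotes: expanded, kept as one argument
[hello world]
$ printf '[%s]\n' '$intVal'    # single quotes: not expanded at all
[$intVal]
$ printf '[%s]\n' \$intVal     # backslash: the $ is literal here too
[$intVal]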

Second example

$ ffind "testing :"
bash: [: testing: unary operator expected
'testing :' is not a valid resource

This does not work because you forgot to put quotes around $1 in the if [ $1 ] line, so it is subject to word splitting, and three arguments (plus the command name [ itself) are passed to the [ builtin:

  • "["
  • "testing"
  • ":"
  • "]"

instead of the expected two:

  • "["
  • "testing :"
  • "]"

Example #1 is also affected by this, since [ $1 ] splits to ("[", "]") and not ("[", "", "]"). Example #1 works by accident, though: [ with no arguments is valid and simply evaluates to false (exit status 1), which is why the else branch runs.

To fix this problem, put double quotes around $1: use [ "$1" ].

Note: While [ is standard, there also is a bash-specific [[ operator, which actually has different parsing rules from the rest of the code – in particular, it does not split expanded variables. In bash, [[ $1 ]] and [[ "$1" ]] are both equivalent, unlike their [ alternatives.
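All three cases can be reproduced in an interactive shell (a short sketch; set -- is used here to simulate a function's $1):

$ set -- "testing :"            # put "testing :" into $1
$ [ $1 ] && echo non-empty      # unquoted: [ sees "testing" and ":"
bash: [: testing: unary operator expected
$ [ "$1" ] && echo non-empty    # quoted: [ sees one argument
non-empty
$ [[ $1 ]] && echo non-empty    # [[ does not word-split at all
non-empty
$ set --                        # now $1 is unset, as in Example #1
$ [ $1 ] || echo empty          # $1 splits away entirely: [ ] is false
empty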

More bugs

Your function also has several other problems that are not shown in the examples.

find -type f | grep -ir '$1' * | grep -v '.svn'

In this line:

  1. The word '$1' has single quotes around it. This means that bash will not expand its contents – you're actually telling grep to search for the regex $1, and not for the command-line argument.

    To fix this, use double quotes instead – "$1"

  2. The first grep command is being told to recursively search the contents of all files in the current directory (-r and the * wildcard).

    At the same time, you are piping the output of find -type f into grep – seemingly trying to tell grep to search the names of all files.

    This won't work because grep, like most filters, will not read from stdin if given one or more files to search. I don't know what you are trying to search – file names or file contents – so pick one:

    • To search just file names, keep the pipe but remove the file specification:

       find -type f | grep -i "$1" | ...
      
    • To search just file contents, remove the find| instead:

       grep -ir "$1" * | ...
      
    • It is possible to combine both, by explicitly giving grep the "stdin" file:

       find -type f | grep -i "$1" - * | ...
      
       find -type f | grep -i "$1" /dev/stdin * | ...
      

      (/dev/stdin works with all Linux programs, while - is a convention used by some programs, including grep; see the short transcript after this list.)

  3. In the second grep command, the search regex is a bit too broad. (Remember that it is a regex, not a fixed string, so the . matches any character.) .svn would match even something like "not-a-svn-file".

    To exclude the .svn directory, use grep -v "/\.svn/" instead.

    If searching file content (grep -ir ...), it is even better to get rid of the grep -v command entirely, and add --exclude-dir=".svn" to the first one.
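To see the stdin behaviour from item 2 in isolation, here is a minimal demonstration (using /dev/null as a stand-in for a real file):

$ echo needle | grep needle /dev/null     # a file was given, so stdin is ignored
$ echo needle | grep needle - /dev/null   # "-" adds stdin back as a named input
(standard input):needle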

You can stop reading at this point

Items below are just good practice in sh scripts.

  1. The function keyword is unnecessary: ffind() { ... is sufficient, and works in all POSIX shells (while function ffind would not).

  2. If a script, program, or function fails, it should return a "failure" status to its parent. By convention, Unix programs consider 0 to mean "success" and anything else "failure" (though there are exceptions to the latter).

    To return a status explicitly, use return <num> in a function (or exit <num> in a standalone script):

    else
        echo "'$1' is not a valid resource" >&2
        return 1
    fi
    
  3. Similarly, error messages should not be mixed with normal stdout, but written to stderr (fd #2) instead, using the >& redirection operator (see the above example). This way you can redirect the normal output to a file (e.g. ffind intVal > results.txt) while still having errors displayed on screen.
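The effect is easy to demonstrate on its own (a tiny illustrative snippet, unrelated to the function itself):

$ { echo "normal output"; echo "an error" >&2; } > out.txt
an error
$ cat out.txt
normal output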

Fixed code

ffind()
{
     if [ "$1" ] ; then
         grep -ir --exclude-dir=".svn" "$1" .
     else
         echo "'$1' is not a valid resource" >&2
         return 1
     fi
}
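With this version, the invocations from the question behave sensibly (any match output naturally depends on what your files contain):

$ ffind '$intVal'      # single quotes keep the $ literal all the way to grep
$ ffind "testing :"    # the whole string reaches grep as one pattern
$ ffind ""             # an empty argument now takes the error branch
'' is not a valid resource
$ echo $?
1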

Better tools

ack claims to be "better than grep". Running ack "testing :" would search your source code, and automatically skip .svn and similar directories.

user1686

Posted 2012-08-31T19:14:27.050

Reputation: 283 655