Bash: executing concatenated variables

1

I have a Linux bash script, to the effect of:

COMMAND_TO_EXECUTE="foo "$1" --option1 --option2 "$2
exec $COMMAND_TO_EXECUTE

Problem: the script fails

Debugging clue: if I echo $COMMAND_TO_EXECUTE and then cut/paste the echoed string into a terminal window, it works perfectly.

So... the string behind $COMMAND_TO_EXECUTE is valid in a terminal, but not valid in a script. Is there something I should do to $COMMAND_TO_EXECUTE before trying to execute it?

Amplifying info: the commands to execute have been wget or curl; I have the same problems with both. (I have properly quoted strings and escaped characters like &.) As mentioned before, the command works fine if I echo, then cut/paste it.

I am puzzled and feel I am missing something elementary, because the command works when cut/pasted but not in the script.

UPDATE: Polyergic's reply below worked for me. bash -c "$COMMAND_TO_EXECUTE" runs properly.

Paulb

Posted 2014-12-12T20:10:07.517

Reputation: 737

As a side note, why do you have the variables outside the double quotes? – ShadSterling – 2014-12-14T02:14:34.200

The quotes delimit the text that I intended to concatenate the variables to. – Paulb – 2014-12-14T08:41:26.097

Thinking of it in terms of concatenation doesn't really match what Bash is doing. It's not like "real programming languages" where you store values and use operators, it's just manipulating that line of text. (I'd call it a string, but it's misleading to think of it as a string in the sense of other languages.) In short, it's more idiomatic and marginally better to do the variable expansion inside the doublequoted string. – ShadSterling – 2014-12-14T18:41:59.480

Answers

1

The idiom I've used is this:

cmd="some command in a string"
$cmd

Note that unlike using exec, the calling script will continue.

Some modification may be required if your command includes special characters.
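For instance (a minimal sketch of the limitation): when the variable is expanded unquoted, the shell splits the stored string into words, but it does not re-parse quote characters, so quotes inside the variable reach the command literally:

```shell
#!/bin/sh
# The unquoted-variable idiom works when the command is plain words:
cmd='echo hello'
$cmd                  # prints: hello

# But quotes stored inside the string are passed through as literal
# characters, not interpreted as shell syntax:
cmd='echo "hello world"'
$cmd                  # prints: "hello world"  (quotes included)
```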

Here's another variation I've used (I don't remember why I did it this way):

cmd="some command in a string"
bash -c "$cmd"
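The difference from the first variation is that bash -c hands the string to a new shell for a complete parsing pass, so quoting inside the variable takes effect (a minimal sketch):

```shell
#!/bin/sh
cmd='echo "hello world"'
bash -c "$cmd"        # prints: hello world  -- the inner quotes are parsed this time
```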

Here's a complete example script, which I use to suppress expected output from cronjobs:

#!/bin/sh

# otrap
# Copyright © 2007-2014 Shad Sterling <me@shadsterling.com>
# Released under CC-BY-SA, http://creativecommons.org/licenses/by-sa/4.0/

if [ "" = "$2" ]; then
    echo traps the output of a command and echoes the output only if the size doesn\'t match
    echo USAGE: $0 \<size\> [\"test\"] \<command\>
    echo size is the expected output size
    echo \"test\" means echo the output even if the size does match
    echo command is the command to silence unless its output doesn\'t match
    exit 2
fi;

test=false
size=$1; shift
echo=false
if [ "test" == "$1" ]; then
    test=true; shift
    echo=true
fi
cmd="$*"

TF=~/tmp/otrap.#$PPID.log
if [ "false" != "$echo" ]; then
    echo file: "$TF"
    echo running: "$cmd"
fi
starttime=`date +%R`
$SHELL -c "$cmd" > $TF 2>&1
ret=$?
endtime=`date +%R`
ST=`\`dirname $0\`/filesize $TF 2>&1`
if [ "$size" != "$ST" ]; then
    echo=true;
fi;
if [ "false" != "$echo" ]; then
    echo " command:" "$cmd"
    echo "   start:" $starttime
    echo "  finish:" $endtime
    echo "returned:" $ret
    echo "    size:" $ST
    echo "expected:" $size
    echo --------------------------------------------
    cat $TF
fi
rm $TF
exit $ret

ShadSterling

Posted 2014-12-12T20:10:07.517

Reputation: 1 111

bash -c "$cmd" worked for me. I never thought the string would need further processing before invoking (the -c option). But it works. – Paulb – 2014-12-13T09:13:56.047

I'd recommend against using bash -c for this sort of thing; it's essentially eval with the added complication that it runs the command under a subshell. eval has a well-deserved reputation as a source of bugs. What eval and bash -c do, essentially, is add an extra layer of shell parsing/preprocessing to the command, and if you don't understand exactly how shell parsing works... adding more of it just adds more of something you don't quite understand. – Gordon Davisson – 2014-12-13T22:34:12.087

I don't know what sort of thing the OP is using it for; in my script, working like eval is the point. There aren't many cases where I'd want to do this, but I think there are a few cases where it's appropriate. What would you use instead? – ShadSterling – 2014-12-14T02:11:02.260

@Polyergic: it really depends on how complex the command is (i.e. just a command and its arguments, or a compound command, or something with redirects, or...) and why it's being put in a variable rather than executed directly. There are a number of options, and in some cases eval is the best one -- but not in very many cases. – Gordon Davisson – 2014-12-15T06:43:57.357

3

Short answer: See BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!.

Long answer: When the shell parses a command line, it does things like figuring out which portions of the line are in quotes (or escaped or whatever) before it substitutes variables; thus, if you have any quotes or escapes inside the variables' values, by the time they're substituted into the command it's too late for them to do anything. It does do a little bit of parsing on the values of substituted variables: it splits them into "words" based on spaces, tabs, etc (no matter whether they're in quotes), and it expands wildcards (again, even if they're in quotes). BTW, it's also too late for other bits of shell syntax like pipes, redirects, etc to take effect. To illustrate this, I have a command printargs that prints its arguments. Here's what happens when I try to store a complex command in a variable:

$ cmd='printargs " * " | cat >outfile &'
$ $cmd
Got 8 arguments:
    '"'
    'file1.txt'
    'file2.txt'
    '"'
    '|'
    'cat'
    '>outfile'
    '&'

Note that the quotes, pipe, etc are all treated as normal characters not shell syntax, but that the asterisk got treated as a wildcard and replaced by a list of filenames.

There are several solutions, depending on why you put the command in a variable in the first place:

  • If you don't really need to put the command in a variable, don't. Commands are meant to be executed, so unless there's a good reason not to, just execute it directly.

  • If you want to use essentially the same command several times & don't want to have to write the whole thing out every time (the "don't repeat yourself" rule of programming), use a function:

    execute_command() {
        foo "$1" --option1 --option2 "$2"
    }
    

    ...and then call it repeatedly. Note that I put the variable references in double-quotes; you should (almost) always do this to prevent them having word splitting and wildcard expansion applied to them.

  • If you need to build the command dynamically, use an array:

    post_data=("var1=value1" "var2=value2" ...)
    
    post_args=()
    for post_arg in "${post_data[@]}"; do   # Note that this is the correct idiom for expanding an array in bash
        post_args+=(-d "$post_arg")
    done
    curl "${post_args[@]}" "$url"
    

Note that this works for complex arguments for a single command, but won't work for things like pipes, redirects, and backgrounding (&), because again those are parsed before variables get substituted.
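As a runnable sketch of the array approach, with printf standing in for curl (the variable names and data values here are placeholders):

```shell
#!/bin/bash
# Each array element remains a single word, even with embedded spaces
post_data=("var1=value one" "var2=value two")

post_args=()
for post_arg in "${post_data[@]}"; do
    post_args+=(-d "$post_arg")
done

# printf stands in for curl here; it prints one argument per line,
# showing that each -d and each value arrive as separate, intact words
printf '%s\n' "${post_args[@]}"
```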

Finally, some warnings:

  • Don't use eval or bash -c unless you know exactly how shell parsing works (and if you're asking this question, you don't know exactly how shell parsing works). Both of these cause an extra layer of parsing to happen, which tends to work great in testing but fail occasionally (for incomprehensible reasons). eval has a well-deserved reputation as a source of really strange and subtle bugs; and bash -c does essentially the same thing, just with a subshell thrown in to make it even weirder.

  • Double-quote variable references to prevent unexpected word splitting and wildcard expansion.

  • You probably don't want to use exec -- it exits the current shell (& shell script) and replaces it with the command; this is probably not what you intended.

Gordon Davisson

Posted 2014-12-12T20:10:07.517

Reputation: 28 538

I agree with most of what you said, but I'm not sure the link about complex cases is relevant. The OP seemed to be asking about a very simple case. – ShadSterling – 2014-12-14T02:13:22.790

@Polyergic: The example given isn't complex (well, depending on what $1 and $2 are), but he mentions "properly quoted strings and escaped characters like &". Also, if the real command involved were that simple, exec $COMMAND_TO_EXECUTE would work (except for the exec part). – Gordon Davisson – 2014-12-14T03:25:52.877

0

As you have typed it COMMAND-TO-EXECUTE is not a valid shell variable name, remove the dashes. This might give some clues:

$ echo $COMMAND-TO-EXECUTE
-TO-EXECUTE

$ COMMAND-TO-EXECUTE=Test
COMMAND-TO-EXECUTE=Test: command not found

$ COMMAND_TO_EXECUTE=Test

$ echo $COMMAND_TO_EXECUTE
Test

$ 

Then "foo "$1" --option1 --option2 "$2 looks a bit odd. If you want COMMAND_TO_EXECUTE to contain " characters, change this to "foo \"$1\" --option1 --option2 $2" - also note that I moved the quote at the end. This "safeguards" $2 against causing strange effects before you leave it to exec.
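For reference, a minimal sketch of what the original assignment actually stores (with placeholder values standing in for $1 and $2):

```shell
#!/bin/sh
set -- alpha beta     # placeholder positional parameters
COMMAND_TO_EXECUTE="foo "$1" --option1 --option2 "$2
echo "$COMMAND_TO_EXECUTE"   # prints: foo alpha --option1 --option2 beta
```

Note that no quote characters end up in the stored string; the quotes in the assignment only delimit the literal parts during concatenation.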

As you have not given an actual example command to try out / debug, I can't think of more...

Hannu

Posted 2014-12-12T20:10:07.517

Reputation: 4 950

As I see it the string assigned to the variable would not contain any quotes (unless $1/$2 contain them): it is exactly equivalent to "foo $1 --option1 --option2 $2" unless $2 contains special characters. The difference that echo/copy/paste makes is that shell expansion is done when the pasted command is executed, but it isn't when the variable contents are executed (try CMD="echo a; echo b" and $CMD to see the difference). – AFH – 2014-12-12T23:26:48.417

Good catch on the underscore versus dash. My mistake in transposing the code to the internet, sorry. I edited. – Paulb – 2014-12-13T09:20:04.777