
According to the man page, xargs will quit if any invocation of the command exits with a status of 255:

If any invocation of the command exits with a status of 255, xargs will stop immediately without reading any further input. An error message is issued on stderr when this happens.

How can I get xargs to not do this?

I have a 1500 or so line batch job that I want to run, 50 lines at a time. I was finding that it was always dying at a certain line, and not completing the job. Not good!

An even better question, the question describing what I am trying to do, is:

How can I run a 1500 line batch script, 50 lines at a time, so that it does not quit the job in the middle, and so that the output is captured to a log file of some kind?
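For concreteness, here is a minimal sketch of that requirement using split rather than xargs; batch.txt and batch.log are placeholder names, and the printf line stands in for the real 1500-line job:

```shell
# Stand-in batch job; the real one would be ~1500 lines of commands.
printf 'echo line1\necho line2\necho line3\n' > batch.txt

split -l 50 batch.txt chunk.            # 50-line pieces: chunk.aa, chunk.ab, ...
for f in chunk.*; do
  sh "$f" >> batch.log 2>&1 || true     # a failing chunk never aborts the loop
done
rm -f chunk.*
```

All stdout and stderr ends up in batch.log, and the `|| true` guarantees the loop visits every chunk regardless of exit statuses.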

JDS

6 Answers


Similar to larsks' answer, but more explicit:

xargs sh -c "somecommand || true"
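Note that as written, the items xargs appends become positional parameters of sh (starting at $0), not arguments of somecommand; to actually forward them you need a "$1" inside the script and a dummy $0 after it. A runnable sketch, with false standing in for a failing command:

```shell
# Each input item is passed to the subshell as "$1" (the trailing "sh" fills $0).
# "false" stands in for a command that fails; "|| true" masks the nonzero
# status, so xargs reads all input and itself exits 0.
printf 'a\nb\nc\n' | xargs -n1 sh -c 'false "$1" || true' sh
echo "xargs exit: $?"
```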
Filippo Vitale

You could wrap the perl script with another simple bash script:

#!/bin/bash
real-command "$@" || exit 0

This will call real-command, passing it all the parameters you pass to this fake command, and it will always return an exit code of 0 (i.e. report success), so xargs will never stop because of it.
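A runnable sketch of the wrapper idea, with false standing in for real-command and the wrapper written to a temporary path:

```shell
# Create a throwaway wrapper that swallows the real command's exit status.
cat > /tmp/fake-command <<'EOF'
#!/bin/sh
false "$@" || exit 0   # "false" stands in for real-command
EOF
chmod +x /tmp/fake-command

# xargs sees exit 0 from every invocation and processes all input.
printf '1\n2\n3\n' | xargs -n1 /tmp/fake-command
echo "xargs exit: $?"
```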

user842313

Just found a fun answer to this one, though its usefulness will depend on the command you're trying to run.

If you're using xargs essentially to assemble a list of commands, you can get this behavior by telling xargs to echo each command and piping the result to bash.

For example, if you're trying to delete a list of things that may or may not exist:

# presume this will fail in a similar way to your command
cat things_to_delete | xargs -n1 delete_command_that_might_exit

# instead echo the commands and pipe to bash
cat things_to_delete | xargs -n1 echo delete_command_that_might_exit | bash

This works because, first, xargs only ever calls echo, so it never sees any errors; and second, bash's default behavior is to continue execution after a failed statement.
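A self-contained demonstration of that behavior, with false standing in for the deletion command: it fails on every item, yet bash still executes every generated line:

```shell
# xargs only runs echo, so it never sees a failure; bash then runs each
# generated line ("false a ; echo ran a", ...) and carries on past the
# failed "false", printing "ran a", "ran b", "ran c".
printf 'a\nb\nc\n' | xargs -I% echo 'false % ; echo ran %' | bash
```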

To be more specific about my case, I was using this to remove a bunch of old application versions from AWS ElasticBeanstalk like so:

aws elasticbeanstalk describe-application-versions --application-name myapp |\
jq -r '.ApplicationVersions | sort_by(.DateCreated) | .[0:-10] | .[].VersionLabel' |\
xargs -n1 \
  echo aws elasticbeanstalk delete-application-version \
       --delete-source-bundle --application-name myapp --version-label |\
bash
matschaffer
  • I'll avoid it because piping to bash looks elegant until you try to cancel with CTRL+C. I had to cancel once for every single command that was remaining to be processed. Is there something I'm missing to be able to cancel at the xargs level rather than the bash level? – Iain Samuel McLean Elder Mar 04 '20 at 16:22
  • I suspect you'd need to kill the top level process (probably the terminal shell) that started the pipe for that. – matschaffer Mar 10 '20 at 03:10

You could write your xargs invocation to mask the return codes of your command lines. With something like the following, xargs will never see the exit codes returned by somecommand:

xargs sh -c "somecommand || :"
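A runnable sketch of the effect. The 255 failure has to be faked with a subshell, (exit 255), because a bare exit 255 would terminate sh -c before || : could run; ":" is the POSIX no-op built-in, equivalent to true here:

```shell
# Unmasked, an exit status of 255 makes xargs abort immediately; with
# "|| :" each invocation reports 0 and xargs processes all input.
printf 'a\nb\nc\n' | xargs -n1 sh -c '(exit 255) || :' sh
echo "xargs exit: $?"
```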
larsks
  • I've come up with a good solution: make sure the commands being processed do not exit with a 255 status! **Additional Details** The command being processed is a Perl script. The Perl die() function was being used in several places to exit out if some critical error occurred (e.g. could not connect to a database). However, die() always exits with error status 255. The solution in this case was to replace die() with a combination of print and exit(), along with a more reasonable error code ("1" worked in this case). – JDS Jul 11 '11 at 16:19

The following construction works for me:

ls | xargs -I % svn upgrade %

Even if svn upgrade failed on some element, the process continued.
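A caveat worth noting: -I doesn't itself mask exit codes; this works because svn exits with small error statuses, and xargs only aborts on exactly 255. A sketch showing xargs pushing through ordinary failures:

```shell
# Each invocation exits 1 (not 255), so xargs still processes every item;
# afterwards xargs itself exits 123 (GNU xargs: some invocation failed).
printf 'a\nb\nc\n' | xargs -I % sh -c 'echo processed %; false'
echo "xargs exit: $?"
```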

AndreyP

If you were using xargs with find, use find's -exec option instead:

find . -name '*.log' -exec somecommand {} \;
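Regarding the batching and parallelism discussed in the comments below: find can batch many arguments per invocation with +, much like xargs, and parallel runs can be had by pairing find with GNU xargs. In this sketch, echo stands in for the real command:

```shell
# "+" batches many files into one invocation, like xargs does:
find . -name '*.log' -exec echo somecommand {} +

# For parallelism, feed find's output to xargs: -print0/-0 survive odd
# filenames, -n 50 limits batch size, -P 4 runs four invocations at once
# (-print0, -0 and -P are GNU extensions):
find . -name '*.log' -print0 | xargs -0 -n 50 -P 4 echo somecommand
```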
Roger Dahl
  • Howdy. I could use that, but the -exec option doesn't parallelize operations the way using xargs can and does. – JDS Oct 23 '14 at 14:52
  • Thank you -- I didn't know that `xargs` could run commands in parallel. Cool. If you only want to minimize the number of command invocations, `-exec` has a `+` parameter. – Roger Dahl Oct 23 '14 at 15:05