How can I continue a loop in bash after an application returned an error?

3

I save images with:

#!/bin/bash
for i in {1..30000}
do
    wget "http://services.runescape.com/m=itemdb_rs/3809_obj_sprite.gif?id="$i
done

Is there a way to speed up the process? Sometimes a URL does not exist and I get:

connected.
HTTP request sent, awaiting response... 404 Not found
2012-08-04 18:09:36 ERROR 404: Not found.

How can I continue after this error?

Szymon Toda

Posted 2012-08-04T16:17:24.130

Reputation: 1 239

If you have an additional question about how to download out only certain object, please ask an additional question. Mixing two subjects in one question just over-complicates the matter :) – Der Hochstapler – 2012-08-04T16:26:52.757

@OliverSalzburg - Your comment is spot on. But you chose to edit the title to reflect the question you answered, neglecting to notice that there were two answers answering his other question. – Nifle – 2012-08-04T16:37:04.010

@Nifle: Feel free to read the history of the question and change it if you feel that I've missed the point with my edit. Also, please note that I edited and commented on the question before I wrote my answer. – Der Hochstapler – 2012-08-04T16:39:47.423

Answers

8

Try

wget "http://s.../m=itemdb_rs/3809_obj_sprite.gif?id="$i || true

This way, if wget fails, the result of that line is still zero and your script continues.

To learn how to properly handle errors in a bash script, read up on bash exit statuses and error handling.
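A minimal sketch of the original loop with that guard applied (assuming the script runs under set -e, which is the usual reason a failed command would abort a loop; without errexit, a plain for loop already continues after a failed wget):

#!/bin/bash
set -e  # assumption: errexit is what made the loop stop on a 404

for i in {1..30000}
do
    # "|| true" forces the exit status of the line to 0, so errexit
    # does not abort the loop when wget fails (e.g. on a 404)
    wget "http://services.runescape.com/m=itemdb_rs/3809_obj_sprite.gif?id=$i" || true
done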

Der Hochstapler

Posted 2012-08-04T16:17:24.130

Reputation: 77 228

1

You could try something like this:

#!/bin/bash
for i in {1..30000}
do
    x="http://64.79.147.130/m=itemdb_rs/3809_obj_sprite.gif?id=$i"
    # GET -sd (from libwww-perl) prints the response status without the body;
    # only download when the status does not contain a 404
    if [[ $(GET -sd "$x" | grep "404") = "" ]]
    then
        wget --no-dns-cache -nc -U "Firefox/10" "$x" 2>&1 \
            | grep "Saving" | sed 's,Saving to,Saved,'
    fi
done
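As a design note, the GET pre-check keeps wget's 404 noise out of the output, at the cost of one extra request per id. A rough alternative sketch, relying only on wget's own non-zero exit status for missing ids (assumes quiet mode and no-clobber are acceptable):

#!/bin/bash
for i in {1..30000}
do
    url="http://services.runescape.com/m=itemdb_rs/3809_obj_sprite.gif?id=$i"
    # wget exits non-zero on a 404, so the message below is only
    # printed for ids that actually exist
    if wget -q -nc "$url"; then
        echo "Saved id $i"
    fi
done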

tao

Posted 2012-08-04T16:17:24.130

Reputation: 1 355