How to generate a list of URLs in bash?


How can I generate a list of URLs in bash so that I can pipe them to xargs and then to curl?

http://somewebsite.com/{}.file

where {} is a number.

Testr

Posted 2016-11-26T10:40:53.217

Reputation: 21

echo http://somewebsite.com/{1..100}.file – Ipor Sircer – 2016-11-26T11:21:31.013

@IporSircer Wow that simple. But what about a new line? – Testr – 2016-11-26T11:30:02.710

I found a solution using seq. – Testr – 2016-11-26T11:49:40.050

@Testr, good. Now, you can answer yourself here. – nik – 2016-11-26T13:15:22.367
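The seq-based solution mentioned in the comments is never shown; a minimal sketch of it (assuming GNU seq's -f format option, and using the placeholder host from the question) might look like:

```shell
# Generate one URL per line using seq's printf-style format string.
# %g is the sequence number; output is newline-separated by default.
seq -f 'http://somewebsite.com/%g.file' 1 100
```

Each line can then be piped to xargs, e.g. `seq -f ... | xargs -n1 curl -O`.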

Answers


As Ipor Sircer said, you can use echo http://somewebsite.com/{1..100}.file | xargs .... If you want newlines between entries, use printf '%s\n' http://somewebsite.com/{1..100}.file (the newlines don't matter to xargs, which splits on any whitespace by default). But for something like what you're describing, a for loop might be better:

for url in http://somewebsite.com/{1..100}.file; do
    curl "$url"
done

That way, if you need any additional per-file scripting (which I often do with things like this), you can write it directly, rather than having to figure out how to embed it in an xargs command line.
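Put together, the printf-to-xargs pipeline described above might look like the following sketch (the -n1 and -O flags are my additions, not part of the original answer):

```shell
# Expand 1..100 into newline-separated URLs, then run curl once per URL.
# -n1 passes one argument per curl invocation; -O saves each file
# under its remote name instead of writing to stdout.
printf '%s\n' http://somewebsite.com/{1..100}.file | xargs -n1 curl -s -O
```

This is equivalent in effect to the for loop, but with curl's options fixed up front rather than scripted per file.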

Gordon Davisson

Posted 2016-11-26T10:40:53.217

Reputation: 28 538