How to automatically save a list of files with different names?

I have a text file of URLs, each in the form of:

http://domain.com/a/b/c/d.jpg

I want to download all of these, but save each file under the name:

c_d.jpg

In other words, I want to save each file under its original filename, prefixed by the name of its parent directory.

How would I go about doing this on Windows?

I'm fine with using a command-line tool such as wget or curl; just give me the arguments.

Thanks.

user294732

Posted 2014-07-08T22:59:18.053

Reputation: 146

That is completely unhelpful. I know these tools exist. I'm asking how to get any one of them to do what I described. – user294732 – 2014-07-08T23:57:30.897

Answers

I'm not sure how to do this in a pure Windows environment, but under Cygwin you could try this (requires bash, sed, and wget):

while read -r link; do a=$(echo "$link" | sed 's|.*/\(.*\)/\(.*\)|wget & -O \1_\2|'); echo "$a"; eval "$a"; done < links.txt

where links.txt is your file.
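For example, with the URL from the question in links.txt, the loop would echo and then run:

wget http://domain.com/a/b/c/d.jpg -O c_d.jpg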

Of course, you can tweak the sed expression to convert the link into a filename any way you like.
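If you'd rather not build the command as a string and run it through eval, here is a minimal sketch of the same idea (still assuming bash and wget under Cygwin, with one URL per line in links.txt) that derives the name directly with shell parameter expansion:

# Take each URL's filename and its parent directory name,
# then save the download as parent_filename.
while read -r link; do
    file=${link##*/}                          # d.jpg
    parent=$(basename "$(dirname "$link")")   # c
    wget "$link" -O "${parent}_${file}"
done < links.txt

This produces the same c_d.jpg naming as the sed version, just without generating a command string first.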

Cheers

loluengo

Posted 2014-07-08T22:59:18.053

Reputation: 111