
This question is similar to Download files from a list, but it is not the same, as I want to go beyond that. Please do not close it as a duplicate.

I have a list of files to download in a text file, foo.txt. I'm working in a directory foo, and I use wget -i ../foo.txt to pull the files down. That's all fine and dandy.
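For illustration, foo.txt contains something like this (the URLs here are hypothetical; the real ones are API queries against my server, one per line):

https://api.example.com/query?item=1
https://api.example.com/query?item=2
https://api.example.com/query?item=3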

However, what I need to do is download those files with a specific naming pattern. The URLs are API calls to a specific server that spit out JSON as text, and I need each response saved under a name like foo_1.json; because the URL doesn't actually link to a foo_1.json file, wget won't save it as such.

Is there any way to expand upon the wget functionality to save files it downloads in a specific filename pattern, say, sequentially?

If not, alternative CLI methods not using wget are acceptable.

Thomas Ward
  • If wget is too unwieldy for this, please feel free to suggest alternative applications and methods. – Thomas Ward Feb 11 '15 at 00:16
  • What are the filenames that are being downloaded now? – jkd Feb 11 '15 at 00:30
  • @jakekimds They're being saved as the part after the first / in the API query on the URL. They aren't being saved as proper files because the server returns the JSON as text, not as downloadable JSON files. – Thomas Ward Feb 11 '15 at 00:32

1 Answer


You could always use a shell loop:

while IFS= read -r url; do wget -O "foo_$((++i)).json" "$url"; done < list.txt

The file list.txt is your list of URLs, one per line. The while loop reads it line by line, saving each URL in $url; wget downloads it, and the -O flag tells wget to save the result to a file called foo_N.json, where N is the current value of $i. Since $i is incremented on each iteration ($((++i))), this produces an increasing sequence of file names.
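Since you said non-wget tools are acceptable, a minimal curl sketch of the same idea, assuming the same one-URL-per-line list.txt, would be:

i=0
while IFS= read -r url; do
    i=$((i + 1))
    # curl's -o option writes the response body to the given file name
    curl -o "foo_${i}.json" "$url"
done < list.txt

Incrementing with i=$((i + 1)) instead of the bash-style $((++i)) keeps the loop portable to plain POSIX sh.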

terdon