
I'm trying to download all the images from a website.

Here is the website:

https://wall.alphacoders.com/by_sub_category.php?id=173173&name=Naruto+Wallpapers

I tried:

wget -nd -r -P /home/Pictures/ -A jpeg,jpg,bmp,gif,png https://wall.alphacoders.com/by_sub_category.php?id=173173&name=Naruto+Wallpapers

But it doesn't download the images.

Result:

HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
/home/Pictures: Permission denied
/home/Pictures/by_sub_category.php?id=173173: No such file or directory

Cannot write to ‘/home/Pictures/by_sub_category.php?id=173173’ (No such file or directory).

dragon

1 Answer


To download all of the images from the specified page with wget, you can use this command:

wget -qO- 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=Naruto+Wallpapers' | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' | wget -i -

In this example the HTML page is downloaded with wget to STDOUT, parsed with sed so that only the img URLs remain, and piped to wget -i - as an input list for downloading.
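If you also want the files to land in a particular directory, as in the original attempt, you can add -P to the downloading wget. A minimal sketch, where ~/Pictures/naruto is only an example target (the original error came from writing directly to /home/Pictures, which a normal user usually cannot write to, rather than to a directory inside your own home):

# create a target directory you can write to (~/Pictures/naruto is only an example path)
mkdir -p ~/Pictures/naruto
# same pipeline as above, but the final wget saves into that directory via -P
wget -qO- 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=Naruto+Wallpapers' \
  | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' \
  | wget -P ~/Pictures/naruto -i -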

Note that this will only download the images on this page, and they are just thumbnails (350 px wide).

If you'd like to download the full-size images, you should go a step further and rewrite the parsed img URLs so that they correspond to the hi-res images. You can do that with sed or awk:

wget -qO- 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=Naruto+Wallpapers' | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' | awk '{gsub("thumb-350-", ""); print}' | wget -i -

The result of running the last command is a pack of HD wallpapers on your disk.
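Since the rewrite can be done with sed as well as awk, an equivalent sed-only variant of the same pipeline might look like this (just a sketch; it assumes every thumbnail URL on the page carries the thumb-350- prefix):

# extract the img URLs, strip the thumb-350- prefix with a second sed instead of awk,
# and feed the resulting hi-res URLs to wget on stdin
wget -qO- 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=Naruto+Wallpapers' \
  | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' \
  | sed 's/thumb-350-//' \
  | wget -i -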


x1sn0tz