Issue with wget trying to get images from certain websites


I am trying to download all images from http://www.samsung.com/sg/consumer/mobile-devices/smartphones/ using the command below:

wget -e robots=off -nd -np --recursive --level=5 -p --accept jpg,jpeg,png,gif --convert-links -N --limit-rate=200k --wait 1.0 -U 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:14.0) Gecko/20100101 Firefox/14.0.1' -P testing_folder www.samsung.com/sg/consumer/mobile-devices/smartphones

I would expect the phone images to be downloaded to my testing_folder, but all I see are a few site-wide images such as the logo. I don't seem to be able to get the phone images downloaded. The command above seems to work on some other websites, though.

I have gone through the wget questions on this forum, but this particular issue doesn't seem to have an answer. Can someone help? I am sure there is an easy fix. What am I doing wrong?

UPDATE: It looks like the images are loaded by JavaScript, which seems like the end of the road, since wget apparently can't handle JavaScript-generated pages well. If anyone can still help, I will be delighted.


1 Answer

Answered by Joachim Wagner:

Steps:

  1. configure a proxy server, for example Apache httpd with mod_proxy and mod_proxy_http

  2. visit the page with a web browser that supports JavaScript and is configured to use your proxy server

  3. harvest the URLs from the proxy server log file and put them in a file
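
Assuming the proxy writes a default Apache combined access log at /var/log/apache2/access.log (the path and log format vary with your setup), a sketch for pulling the image URLs out of it could look like this:

  # Field 7 of the combined log format is the requested URL;
  # keep only image URLs and de-duplicate them.
  awk '{print $7}' /var/log/apache2/access.log \
    | grep -Ei '\.(jpe?g|png|gif)(\?|$)' \
    | sort -u > image_urls.txt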

Or:

  1. Start Firefox and open web page

  2. F10 - Tools - Page Info - Media - right click - select all - right click - copy

  3. Paste into file with your favourite editor
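
In either case the result is a plain text file with one absolute image URL per line, for example (these URLs are purely illustrative):

  http://www.example.com/images/phone-front.jpg
  http://www.example.com/images/phone-back.png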

Then:

  1. optionally (if you don't want to find out how to get wget to read a list of URLs from a file), add minimal HTML tags (html, body and img) to the file; see the example after this list

  2. use wget to download the images, specifying the file created above (the plain URL list or the minimal HTML wrapper) as the starting point, as sketched below
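
A minimal sketch of both variants, assuming the harvested URLs were saved as image_urls.txt and the optional HTML wrapper as images.html (both file names are only examples):

  # Variant A: let wget read the plain URL list directly
  wget -nd -P testing_folder -i image_urls.txt

  # Variant B: wrap the URLs in a minimal HTML file (images.html), e.g.
  #   <html><body>
  #   <img src="http://www.example.com/images/phone-front.jpg">
  #   <img src="http://www.example.com/images/phone-back.png">
  #   </body></html>
  # then let wget parse it as HTML and fetch the linked images
  wget -nd -P testing_folder --force-html -i images.html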