A bit of script for downloading images from a website
So I wanted to download just the images from this one website for archival purposes. After looking at the readily available tools, I couldn't find anything that suited my purpose, so I decided to throw together a little shell script. The reasoning went as follows:

The pages have incremental indexes, so I can use a while loop to fetch all the pages one by one. Done. Need to fetch each page. Wget is fine. Done. Then I need to look for the image URL in the retrieved HTML. Hmm. A bit of grep with cut does that. Done. Next, get the actual image: build the image URL and wget again. Done. Error handling? Gaah. None, since that's not so trivial to do in a shell script.

After testing it on a few pages it was golden, so I set it running with a counter loop of a hundred images at a time. That's because this script is kind of slow, and I don't know (nor care to learn) how to do multithreading in a shell script. Gaah. Plus, as I later found out, the counter of 100 was a good idea anyway, since wget hung when the net went down for a moment and I had to restart the script...
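For the curious, here's a minimal sketch of what that loop might look like. The base URL, the page pattern, and the grep/cut fields are all placeholders, since they depend entirely on the markup of the actual site; this assumes each page contains an image tag like `<img src="http://example.com/images/foo.jpg">`:

```sh
#!/bin/sh
# Sketch of the approach described above. URL and parsing details
# are hypothetical -- adjust to match the target site's HTML.

BASE="http://example.com/gallery"   # placeholder site
i=1
while [ "$i" -le 100 ]; do          # batch of 100 pages per run
    # Fetch the page for this index.
    wget -q -O page.html "$BASE/page.php?id=$i"

    # Pull the first image URL out of the HTML. The grep pattern and
    # cut fields depend on the page markup.
    img=$(grep -o 'src="[^"]*\.jpg"' page.html | head -n 1 | cut -d'"' -f2)

    # Fetch the actual image, if one was found. No error handling,
    # as noted above.
    [ -n "$img" ] && wget -q "$img"

    i=$((i + 1))
done
```

Restarting at a different batch is then just a matter of changing the starting value of `i`, which is exactly what saved me when wget hung.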