I need a crawler to download all the images from a website ([login to view URL]), organized by category.
The crawler should select the largest available size of each image and save it according to the category label attached to the image. After every few dozen pages, the site shows a verification code (CAPTCHA); the crawler needs to pause so I can enter the code manually, or I can purchase an anti-CAPTCHA plugin. The spider must be reusable and must automatically skip images that have already been downloaded. On delivery, you need to hand over the spider for the site and all the images downloaded from it.
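One common way to meet the "automatically skip already-downloaded images" requirement is to keep a set of content hashes and check each image against it before writing. A minimal Python sketch of that idea (the category names, file names, and folder layout below are placeholders for illustration, not details from this posting):

```python
import hashlib
from pathlib import Path

def save_image(data: bytes, category: str, name: str,
               root: Path, seen_hashes: set) -> bool:
    """Save image bytes under root/category/name, skipping duplicates.

    Returns True if the image was written, False if its content hash
    was already seen (i.e., a re-downloaded duplicate).
    """
    digest = hashlib.sha256(data).hexdigest()
    if digest in seen_hashes:
        return False                      # duplicate content: skip it
    seen_hashes.add(digest)
    folder = root / category              # one folder per category label
    folder.mkdir(parents=True, exist_ok=True)
    (folder / name).write_bytes(data)
    return True
```

Persisting `seen_hashes` to disk between runs (for example as a text file of digests) is what makes the spider reusable: a restarted crawl re-fetches pages but never re-saves images it already has.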