Check URL using DOS script


  1. Posts : 1,049
    Windows 7 Pro 32
       #11

    Sorry, that's the best I can do. The problem is that your PC has to access every URL and wait for the web site to finish loading before you know whether the picture exists or not.

    Normally these kinds of things are done with web crawler tools from a server, but that's another issue and one I can't help you with.


  2. Posts : 36
    windows7 64 bit
    Thread Starter
       #12

    Thanks Tookeri,

    Can you please suggest the best web crawler tool? How can I achieve this functionality in a better and faster way?


  3. Posts : 10,796
    Microsoft Windows 7 Home Premium 64-bits 7601 Multiprocessor Free Service Pack 1
       #13

    born2achieve said:
    Hey Dude,

    Finally I was able to install GetGnuWin32 and completed all the installation steps specified in the document. Now I can see the wget exe, and I tried:

    D:\GnuWin32\GetGnuWin32\bin> wget.exe Error 404 (Not Found)!!1 ... 4530698365

    I could see the result in the command prompt, and I can see the image downloaded in the root folder. Now, could you please help me with reading URL.txt, which has 200,000 URLs, and writing the URLs that don't have an image to output.txt?
    Then I tried to achieve the exact output with the code below:

    Code:
    @echo off
    rem Read url-list.txt line by line; --spider makes wget check each URL
    rem without downloading the file. Log every URL that wget cannot reach.
    (for /f "usebackq delims=" %%a in ("url-list.txt") do (
        "D:\GnuWin32\GetGnuWin32\bin\wget.exe" --spider "%%a" || echo missing %%a
    ))>url.log
    pause
    I am good now, but this doesn't seem to be the fastest way; it takes plenty of time to output the result. Please guide me to the fastest way to achieve this.
    Use wget -S --spider.

    Post the results.
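
    For example, adapting the loop from your earlier post (this is just a sketch: the wget path, the timeout values, and the headers.log name are assumptions, so adjust them to your setup):

    Code:
    @echo off
    rem -S makes wget print the server's response headers (wget writes its
    rem log to stderr, so they end up in headers.log). Short timeouts keep
    rem dead URLs from hanging the loop for minutes at a time.
    (for /f "usebackq delims=" %%a in ("url-list.txt") do (
        "D:\GnuWin32\GetGnuWin32\bin\wget.exe" -S --spider --timeout=10 --tries=1 "%%a" || echo missing %%a
    ))>url.log 2>headers.log
    pause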


  4. Posts : 36
    windows7 64 bit
    Thread Starter
       #14

    Hi Kaktusoft,

    Yes, I am using wget -S --spider for this purpose, but it takes a huge amount of time (hours and hours) and it still hasn't finished.

    So what is the best way to achieve this? Please guide me.


  5. Posts : 1,049
    Windows 7 Pro 32
       #15

    Here's a very simple solution that will make it roughly 10 times faster:

    Split the file with URLs into 10 separate files, and create 10 versions of the batch file so that each one reads its own URL file and writes its own output file. Then run all 10 simultaneously. When all have finished, combine the output files into one single file, as sketched below.
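
    A minimal sketch of that setup, assuming the split files are named url-list1.txt through url-list10.txt and the worker script is saved as check-urls.bat next to them (the wget path is taken from the earlier posts):

    Code:
    @echo off
    rem check-urls.bat <url-file> <log-file>
    rem One worker: checks every URL in %1 and logs the unreachable ones to %2.
    (for /f "usebackq delims=" %%a in ("%~1") do (
        "D:\GnuWin32\GetGnuWin32\bin\wget.exe" -q --spider "%%a" || echo missing %%a
    ))>"%~2"

    A second batch file then launches all 10 workers in parallel, each in its own window:

    Code:
    @echo off
    rem Start one worker per split file; start returns immediately,
    rem so all 10 workers run at the same time.
    for /l %%i in (1,1,10) do (
        start "worker %%i" cmd /c check-urls.bat url-list%%i.txt url%%i.log
    )

    When all 10 windows have closed, the logs can be combined with something like type url*.log > all.log.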


  6. Posts : 36
    windows7 64 bit
    Thread Starter
       #16

    Hi Tookeri,

    Good solution. I tried splitting into 20 files of 10,000 URLs each and could finish within 3-4 hours. I like this idea.

    Thanks for the great tip, and thanks a lot to everyone who participated and helped me.


  7. Posts : 1,049
    Windows 7 Pro 32
       #17

    Happy to hear it worked out for you :)


 