WinUtils - CheckLinks
CheckLinks - Checks the links on a web site, or a set of HTML files.
CheckLinks [-bdhrvVw] url... [ > ReportOutput.txt ]
|url||The URL to check links on. This should be either a URL of the form http://www.site.com/webpage.html, or the names of HTML files to check (usually *.html).|
|-r||Recurse into all sub-directories. This is only relevant when a file spec. is specified, and is ignored if the URL is an http:// address.|
|-d||Dry - prints out all URLs (but doesn't check them).|
|-b||Report only broken links.|
|-w||'Webify' the output - prints an HTML style report.|
|-h||Displays the usage.|
|-v||Verbose. This output is directed to STDERR.|
|-V||Display version number.|
Produces a report on all the links from the given web pages, written to standard output. If the -w (HTML output) option is used, broken links are highlighted in red.
If a URL starts with "http://", the utility spiders the entire site (or as much of it as it can reach by following links recursively from the starting page), checking every HREF and SRC attribute using GET and HEAD requests issued via LWP::UserAgent.
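The core of such a check can be sketched as follows. This is an illustrative sketch, not code from CheckLinks itself; the check_link subroutine name is an assumption. It tries a lightweight HEAD request first and falls back to GET, since some servers reject HEAD:

```perl
use strict;
use warnings;
use LWP::UserAgent;

# check_link is a hypothetical helper, not part of CheckLinks itself.
# A HEAD request avoids downloading the whole resource; some servers
# refuse HEAD, so retry with GET before declaring the link broken.
sub check_link {
    my ($url) = @_;
    my $ua  = LWP::UserAgent->new( timeout => 10 );
    my $res = $ua->head($url);
    $res = $ua->get($url) unless $res->is_success;
    return $res->is_success;
}

print check_link('http://www.mysite.com/') ? "ok\n" : "BROKEN\n";
```

A real spider would additionally parse each fetched page for further HREF and SRC attributes and queue any URLs on the same site that it has not yet visited.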
If the URL is a file (or file spec.), the search is not recursive by default; only the specified files are checked.
C:\MyWebsite\>CheckLinks -wr *.htm* > LinksReport.html
This will check all the HTML files in the directory C:\MyWebsite and generate an HTML report called LinksReport.html.
C:\tmp\>CheckLinks -w http://www.mysite.com > LinksReport.html
This will check all the links on the web-page www.mysite.com and generate an HTML report called LinksReport.html.
The utility also depends on several Perl modules, some or all of which may be distributed with the latest version of Perl. If not, you can download them from CPAN (the Comprehensive Perl Archive Network).
The utility is distributed as a Perl source file, but you can convert it to a DOS batch file using Perl's pl2bat utility.
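For example, assuming the script is saved as CheckLinks.pl (the filename here is an assumption), the conversion and a subsequent run might look like this:

```
REM pl2bat ships with Perl on Windows; it wraps the Perl source in a batch file.
C:\tmp\>pl2bat CheckLinks.pl

REM This produces CheckLinks.bat, which can then be run directly:
C:\tmp\>CheckLinks -w http://www.mysite.com > LinksReport.html
```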