W3C::LinkChecker is the WWW Consortium's own link checker: it looks at the links in HTML pages and tries to check whether they are OK.
It works, but the results are horribly ugly, and it's buggy: it prints "uninitialized value" warnings to STDERR as it runs.
With the HTML output, the results are silly, bossy messages ("This link is broken. Fix it NOW!") in a horrible colour scheme of red, yellow and grey.
Another problem: it doesn't seem to have any way to stop repeating the same messages when links to the same page appear on more than one checked page. When I ran it with its "--recursive" option on a set of pages which all contain links to the W3C's CSS and HTML validators, it printed exactly the same messages about robots.txt, over and over again, for every single page it checked.
As far as checking links goes, it worked to some extent, but it lacks features I'd expect in a link checker. Xenu's Link Sleuth probably works better than this.
It works well, and even makes recommendations about updating links which point to referring websites. Just a few minor nitpicks:
* For links to password-protected pages, it says "The link is not public. You'd better specify it.", but the documentation doesn't make clear how to tell the robot that the site is password-protected.
* It doesn't install through the cpan shell, so it has to be downloaded and installed manually.