
Parallel bad links! (Jul 98)

Copyright Notice

This text is copyright by CMP Media, LLC, and is used with their permission. Further distribution or use is not permitted.

This text has appeared in an edited form in WebTechniques magazine. However, the version you are reading here is as the author originally submitted the article for publication, not after their editors applied their creativity.

Please read all the information in the table of contents before using this article.
Download this listing!

Web Techniques Column 27 (Jul 1998)

Roughly two years ago in this column, I took a look at a basic ``link verifier'' script, using off-the-shelf LWP technology to parse HTML, locate the outward links, and recursively descend the web tree looking for bad links. Little did I know the interest I would stir -- it's become one of the most frequently referenced and downloaded scripts of all my columns! Last year, I updated the script, adding a forward-backward line-number cross-reference to the listing. But this year, I've got something even cooler!

Just recently, the new ``parallel user agent'' has reached a fairly stable implementation. This user agent works like the normal LWP user agent, but allows me to register a number of requests to be performed in parallel within a single process. This means I can scan a website in a small fraction of the time! So I decided that this year's annual update to the link checker would be to make it parallel.
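To give a feel for the register-then-wait flow described above, here is a minimal sketch (not the column's listing) of fetching a handful of URLs at once with LWP::Parallel::UserAgent; the URLs and the connection limits are invented for illustration.

```perl
#!/usr/bin/perl
use strict;
use LWP::Parallel::UserAgent;
use HTTP::Request;

my $pua = LWP::Parallel::UserAgent->new;
$pua->max_hosts(5);        # how many different hosts to talk to at once
$pua->timeout(10);         # per-connection timeout, in seconds

## register each request; the agent queues them for parallel fetching
for my $url (qw(http://www.example.com/ http://www.example.com/missing)) {
  $pua->register(HTTP::Request->new(GET => $url));
}

## wait() blocks until every registered request has completed, then
## returns a hashref of entries; each entry holds its HTTP::Response
my $entries = $pua->wait;
for my $key (keys %$entries) {
  my $res = $entries->{$key}->response;
  printf "%s => %s\n", $res->request->uri, $res->code;
}
```

The key difference from a plain LWP::UserAgent loop is that all the registered requests are in flight simultaneously, so total time is bounded by the slowest response rather than the sum of them all.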

As always, the program you're seeing here is not intended as a ``ready to run'' script, but it is in fact useful enough as-is that I'm using it to verify my website. The program is given in [Listing one, below].