Reply To: COLE,GLENN (Non-HP-SantaClara,ex2)
Date: Thu, 18 Jan 2001 17:13:45 -0800
Hans writes:
> Are there any tools for verifying URLs that run on the HPe3000?
>
> Let's say you have a number of URLs stored in a database. Is there some way
> to dump them into a flat file and check each URL (possibly in batch).
to which Mark Bixby replied:
> Perl LWP could do this:
>
> #!/PERL/PUB/perl
> use LWP::Simple;
> $doc = get('http://www.bixby.org/mark/perlix.html');
>
> Since this is Perl, there's almost certainly "more than one way to do it".
Indeed, Randal Schwartz provides an annotated example at his site
for a more complex problem: traversing raw HTML files to verify
the links. (Hans is lucky because in his case, no parsing is
needed. :)
http://www.stonehenge.com/merlyn/WebTechniques/col07.html
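Since Hans would be dumping the URLs into a flat file anyway, LWP::Simple's
head() is enough for a quick reachability check -- it fetches only the
response headers, not the whole page. A minimal sketch (assuming one URL per
line; the file name "urls.txt" is made up for illustration):

```perl
#!/PERL/PUB/perl
# Sketch of a batch URL checker -- not production code.
# Assumes one URL per line in a flat file; "urls.txt" is a made-up name.
use strict;
use LWP::Simple qw(head);

sub check_file {
    my ($file) = @_;
    open(my $fh, '<', $file) or die "Cannot open $file: $!\n";
    while (my $url = <$fh>) {
        chomp $url;
        next unless $url;          # skip blank lines
        # head() returns an empty list when the request fails,
        # so it doubles as a true/false reachability test
        print head($url) ? "OK   $url\n" : "FAIL $url\n";
    }
    close $fh;
}

my $file = shift @ARGV || 'urls.txt';
check_file($file) if -e $file;
```

From MPE this could run as a job stream with output redirected to a file,
giving exactly the batch check Hans asked about.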
Also, because Lars is likely asleep (time difference, you know ;)
I should note that David Flanagan's "Java Examples in a Nutshell"
includes techniques both for reading the contents of a URL and
for just learning more about the URL (pages 84-86 of the 2nd edition).
It's quite simple here as well.
Yet another reason to venture beyond the "traditional" e3000 languages. :)
--Glenn