NAME

  rgetlinks - A small program to recursively list hyperlinks in web pages
  version 0.01

USAGE SUMMARY

  rgetlinks [--depth=N] URL
  rgetlinks --depth=3 http://www.perl.org

ABSTRACT

  This program recursively follows hyperlinks in web pages and lists them.
  Links are output with indentation showing their relative depth in the crawl.
  All links in the target document are followed, down to the specified depth.
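
  For example, a crawl to depth 2 might produce a listing something like the
  following (the indentation width and the sub-links shown here are purely
  illustrative, not the program's exact output):

    http://www.perl.org
      http://www.perl.org/about.html
        http://www.perl.org/about/whitepapers/
      http://learn.perl.org/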

DESCRIPTION

  This program was written to facilitate the automated download of web
  content. It is best used in conjunction with tools like grep and lwp-download.
  For instance, one can recursively crawl a web site with this program,
  redirecting the output to a text file. By running grep on the resulting file,
  one can easily create a shell script or batch file containing a series of
  lwp-download commands, as sketched below.
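
  A rough sketch of that workflow follows. The file names, the grep pattern,
  and the assumption that each link appears on its own (indented) line are
  illustrative, not part of the program's documented interface:

    # 1. Crawl the site and capture the link listing
    rgetlinks --depth=3 http://www.perl.org > links.txt

    # 2. Keep only the links of interest and turn each into an
    #    lwp-download command (strip leading indentation first)
    grep '\.pdf$' links.txt | sed -e 's/^ *//' -e 's/^/lwp-download /' > fetch.sh

    # 3. Run the generated script to download the files
    sh fetch.sh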


SEE ALSO

  Other programs are being written to automate the second and third steps in the
  above procedure.


BUGS

  Please report any bugs to brent_hughes_[at]hotmail[dot]com.
  Comments on the program's code or behavior are welcome, as are
  suggestions for features you would like to see.


AUTHOR

  Brent Hughes
  brent_hughes_[at]hotmail[dot]com

COPYRIGHT AND LICENSE

  Copyright 2003 by Brent Hughes
  This program is free software. Do whatever you want with it.