Short:        Recursively get WWW documents to local disk
Author:       markie@dds.nl
Uploader:     markie dds nl
Type:         comm/tcp
Architecture: m68k-amigaos

AbGetUrl can be used to download WWW documents to local disk for offline
browsing or archival purposes. This version uses multitasking to fetch a
number of documents (default 8) at once. When the downloading is done, the
links in the documents can be converted to be relative to the local disk.

USAGE:

  ABGETURL URL TO/K,DEPTH/N,LINKS/N,LOCAL/S,NOFIX/S,HEADER/S,
           GET/S,NOINDEX/S,RELOAD/S,RELOADHTML/S

  TO          Destination directory
  DEPTH       Recursion depth
  LINKS       Number of links to open simultaneously
  LOCAL       Stay on the local site
  NOFIX       Do not fix links in URLs
  HEADER      Save HTTP headers to disk
  GET         Just get one URL to a file
  NOINDEX     Do not create an index file
  RELOAD      Reload all documents
  RELOADHTML  Reload all HTML documents

This program is giftware; if you like it and use it you should send the
author a gift. (ZIP drive / JAZZ drive / 060 / etc. ;-)

Disclaimer: use this program at your own risk.
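
Example: a rough sketch of how the options above can be combined. The URL and
destination path are placeholders, and the invocation assumes the standard
AmigaDOS argument conventions implied by the template (keyword TO, numeric
DEPTH/LINKS, switch LOCAL):

  ; Fetch the start page and everything it links to, two levels deep,
  ; following up to four links at a time, staying on the same site,
  ; and saving the result under Work:WWW.
  AbGetUrl http://www.example.com/ TO Work:WWW DEPTH 2 LINKS 4 LOCAL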