I seem to recall reading somewhere that one could connect to a website via an HTTP:// URL (as opposed to FTP), issue a DIR *.* command (or something like it), and get the directory listing of the URL back in an array that could be searched. Then we could selectively use an API to download selected files in that directory from the site.
Any direction toward an answer is appreciated. (The WWIPStuff libraries seem to deal only with FTP sites.)
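For what it's worth, HTTP itself has no DIR command; the usual trick is that many web servers auto-generate an HTML index page for a directory, so you fetch that page and parse the anchor hrefs into your searchable array, then download the files you want. Here is a rough sketch of that approach in Python (the URL, file names, and function names are all hypothetical placeholders, not anything from WWIPStuff):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags in a server-generated index page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def list_directory(index_html, base_url):
    """Return absolute URLs for every link found in the index page HTML."""
    parser = LinkExtractor()
    parser.feed(index_html)
    return [urljoin(base_url, href) for href in parser.links]


def download(url, dest_path):
    """Fetch one file over HTTP and save it locally."""
    with urlopen(url) as resp, open(dest_path, "wb") as out:
        out.write(resp.read())


# Demonstration against a sample auto-generated index page (no network needed;
# example.com and the file names are made up for illustration):
sample = ('<html><body><a href="report.txt">report.txt</a> '
          '<a href="data.csv">data.csv</a></body></html>')
files = list_directory(sample, "http://example.com/pub/")
# files == ['http://example.com/pub/report.txt', 'http://example.com/pub/data.csv']
```

Note this only works when the server actually exposes an index page for the directory; if it serves a default document (e.g. index.html) or directory listing is disabled, there is no portable way over plain HTTP to enumerate the files.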
TIA
Bill
CySolutions, Medical Information Technology
You're only as good as your last
success, so . . . if it works . . . don't fix it!