>>I seem to recall seeing that one could connect to a website via an HTTP:// URL (as opposed to FTP), issue a DIR *.* command (or something like it), and get the directory of the URL back in an array that could be searched. Then we could selectively use an API to download selected files in that directory from the site.
>>
>>Any pointers toward an answer are appreciated. (The WWIPStuff libs seem to deal only with FTP sites.)
>>
>
>Most folders published on the web do not allow directory browsing. If you have control over the site, just create an FTP site pointing at the same folder.
The sites we are working with do allow us to retrieve a directory listing from the http: location. Can the FTP command for retrieving a directory be used against an HTTP site?
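(Strictly speaking, no: HTTP has no DIR or LIST command. But when a server allows directory browsing, requesting the folder's URL returns an auto-generated HTML index page, and the file names can be scraped from its links. A minimal sketch of that idea, in Python for brevity; the sample HTML and the `list_directory` helper are illustrative, not part of any library:)

```python
# Sketch: approximate an FTP-style DIR against an HTTP site by
# fetching the server's auto-index page and extracting its links.
# The sample HTML below imitates an Apache-style "Index of" page.
from html.parser import HTMLParser

class IndexLinkParser(HTMLParser):
    """Collect the href target of every anchor tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def list_directory(index_html):
    """Return file/subfolder links, skipping parent-dir and sort links."""
    parser = IndexLinkParser()
    parser.feed(index_html)
    return [h for h in parser.links
            if not h.startswith(("?", "/", "../"))]

# In real use the HTML would come from something like:
#   urllib.request.urlopen("http://example.com/files/").read().decode()
sample = """<html><body><h1>Index of /files</h1>
<a href="../">Parent Directory</a>
<a href="report.pdf">report.pdf</a>
<a href="data.csv">data.csv</a>
</body></html>"""

print(list_directory(sample))  # ['report.pdf', 'data.csv']
```

This only works if the server has browsing enabled for that folder; otherwise the request returns a 403 or the site's error page rather than an index.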
Thx,
Bill
CySolutions, Medical Information Technology
You're only as good as your last
success, so... if it works... don't fix it!