>I'm not sure which category to put this in, so I'll just put it here. My marketing director wants us to examine our "hits page" from our website each day and write down all the links that directed someone to our site, and what keywords brought our site up. I've got a static web URL I go to every day to get to this page. I'll eventually find a way to parse the HTML source of the page through code to get the information he needs, but for now I just want my form to connect to the webpage and download the source code for the page into a memo field in my table. We're on a LAN directly connected to the Internet, so I won't have to do any fancy dial-up connection stuff. Can anybody give me some pointers on how to do this?
>
>Thanks,
>Bryan
Look at the Internet Transfer Control (Inetctls.Inet) that comes with Visual Studio. It makes downloading a page's HTML very easy. I use it to get my long distance phone bill detail on the web. I haven't worked with it in nearly a year, but a sample is available that will let you create a form and see it work in a few minutes. For example (because you have a direct Internet connection), you can pass the URL to the OpenURL() method, which returns the page's HTML instantly; the GetHeader() method returns the HTTP response headers. You can put the HTML into an edit box on the form or just load it into a variable for processing. These 3 lines in a button's Click method do all the work on one form:
THISFORM.cHTML = THISFORM.Olecontrol1.OBJECT.OpenURL(ALLTRIM(THISFORM.txtURL.VALUE))  && page HTML into a form property
THISFORM.edtHeaders.VALUE = THISFORM.Olecontrol1.OBJECT.GetHeader()  && HTTP response headers
THISFORM.REFRESH()
There is a text box (txtURL) for you to type in the URL and an edit box (edtHeaders) that shows the HTTP response headers; the page's HTML itself lands in the form's cHTML property, ready to store in your memo field. This may help you get started.
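In case a non-FoxPro illustration helps, here is a minimal sketch in Python of the same two steps: fetch a page's source (what OpenURL does) and then pull the links out of it for the report. The URL and HTML snippet are made up for the example, and the demo parses a canned string so it runs without touching the network.

```python
# Sketch of "download the source, then parse out the links" (Python stdlib only).
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in a page's HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def fetch_html(url):
    """Download a page's source, roughly what OpenURL does in the Inet control."""
    with urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")


# Demo on a canned snippet (a hypothetical hits-page entry) so no network is needed.
# Search-engine referrers usually carry the keywords in the URL's query string.
sample = '<html><body><a href="http://example.com/?q=foxpro">hit</a></body></html>'
parser = LinkExtractor()
parser.feed(sample)
print(parser.links)
```

Once the HTML is in a variable (or a memo field), the same kind of tag-by-tag scan is what you would eventually write in FoxPro string functions to list the referring links and their keywords.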
Dr. Ken A. McGinnis
Healthcare software design