I am developing a web application with a VFP front end and a hosted LAMP server. The vast majority of each web page is generated dynamically by PHP code.
Since all the current data on the site is only for testing and not for sale, I have blocked the site from search engine crawlers via robots.txt.
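For reference, the rule I'm using is the standard disallow-all (a minimal sketch; the actual file sits in the site's document root):

```
# Block all well-behaved crawlers from the entire site
User-agent: *
Disallow: /
```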
My question is: do search engines request each page the same way a browser does, and thus run the PHP code that generates all the product links, meta tags, etc.? Or do I need to generate full static pages, or something in between? I can't seem to find any definitive answers.
Since the site has to stay blocked for the next few weeks, I can't use anything like Google Webmaster Tools yet, and I would really like to know what direction I should be going.
Beer is proof that God loves man, and wants him to be happy. - Benjamin Franklin
John J. Henn