Style sheets used wisely
Changing the look of your application, as far as display goes, shouldn't require any update to your code. Whatever relates to the formatting of the HTML page should be manageable outside of your application. Fonts, table backgrounds, table cell fonts, table padding, table cell padding, text colour and other related formatting should always be controlled through styles. In your code, always make reference to a class when you need to apply a layout. Even if your style sheet is not ready, just put in the name of the class, which will be defined later. This allows you to update the layout at any time, including on the client's request if a need is expressed when delivering the application.
document.write("<link rel='stylesheet' href='/Stylesheet.css' type='text/css'>")
document.write("<link rel='stylesheet' href='/Netscape.css' type='text/css'>")
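In the HTML your application generates, refer only to a class name; the formatting itself stays in Stylesheet.css. A minimal sketch (the GridCell class name and the lcValue variable are illustrative):

```foxpro
* Build a table cell that carries no formatting of its own,
* only a reference to a class defined in the style sheet
lcHtml = lcHtml + '<TD CLASS="GridCell">' + lcValue + '</TD>'
```

If the client later asks for a different cell font or colour, only the GridCell definition in the style sheet changes; the application code is untouched.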
Auto-generation of specific pages
In most applications, there are always some pages that are generic to all users. By that, I mean that no matter the user's profile, the page is always returned the same. In many cases, it would be overkill to go through your Web application to generate such a page on each request. Instead, the Web site can include a simple link to a static HTML or ASP page, for example, when the user wishes to obtain that page. This gives more headroom to your Web application, as it won't have to deal with those requests. It also returns the page to the user faster, as only the Web server handles the request.
In order to keep those pages up to date, you will probably make use of a robot application, running on a specific server or PC, which is responsible for regenerating them at intervals. In all the Web applications I have delivered, there is always such a robot application running on a dedicated server. This application is responsible for executing a specific set of tasks at specific intervals. In this case, it would regenerate those types of pages every 10 minutes, for example. I will cover other uses of such an application later in this article.
In a site I developed recently, most of the pages accessed by the users are dynamic. However, for 95% of those pages, no request ever reaches the Web application: they are all regenerated by the robot application every minute. Generating the entire set of pages takes about 1.5 seconds, and most of the tables involved are small, so this doesn't overload the data server. It also provides a great side benefit: those pages can be indexed by search engines. This implementation has to be adapted to local requirements, but in most applications you will be able to make use of some of it.
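The page generation step in the robot can be as simple as merging query results into HTML and writing the file where the Web server can serve it directly. A minimal sketch, assuming an illustrative News table and output path:

```foxpro
* Regenerate a static page from the current data
LOCAL lcHtml
SELECT Title, Posted FROM News ORDER BY Posted DESCENDING INTO CURSOR curNews
lcHtml = '<HTML><BODY><TABLE>'
SCAN
   lcHtml = lcHtml + '<TR><TD CLASS="GridCell">' + ALLTRIM(curNews.Title) + '</TD></TR>'
ENDSCAN
lcHtml = lcHtml + '</TABLE></BODY></HTML>'

* Write the page where the Web server picks it up directly
STRTOFILE(lcHtml, 'd:\WebSite\News.htm')
```

Because the file is plain HTML on disk, the Web server negotiates the whole request by itself, and search engines can crawl it like any other static page.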
Rendering of images
Remember accessing a Web site and having the impression that the layout was moving from left to right and back at HTML rendering time? This is commonly known as "jumping" while the page loads. It is caused by IMG tags in the page that don't specify the WIDTH and HEIGHT of the images. It only takes one image tag defined without WIDTH and HEIGHT for you to face that situation.
More and more developers are taking care of that now. However, there are some considerations. If you hardcode those dimensions in your application, then whenever the image size changes, you will have to update your code. As this relates to the layout, that is something you want to avoid.
On most sites I have developed, there is usually a graphic artist handling those images. So, if he decides to change the size of an image, you would prefer that he doesn't have to rely on you every time that occurs. You may make use of the following code when inserting an image in the HTML you are returning:
lcHtml=lcHtml+'<IMG SRC=Test.jpg WIDTH='+ALLTRIM(STR(loImage.Width))+;
 ' HEIGHT='+ALLTRIM(STR(loImage.Height))+'>'
* Return an object to get the image dimensions
* expC1 Full path of the file
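The comment header above belongs to a helper that reads the dimensions from the image file itself. A minimal sketch, assuming VFP 9 with the FFC GDI+ wrapper (the function name GetImageDimensions is illustrative, and the GpImage class and property names may differ in other setups):

```foxpro
* Return an object to get the image dimensions
* expC1 Full path of the file
FUNCTION GetImageDimensions
LPARAMETERS tcFile
LOCAL loGdi, loImage

* Let GDI+ read the dimensions from the file header
loGdi = NEWOBJECT("GpImage", HOME() + "FFC\_gdiplus.vcx")
loGdi.CreateFromFile(tcFile)

* Return a lightweight object exposing only Width and Height
loImage = CREATEOBJECT("Empty")
ADDPROPERTY(loImage, "Width",  loGdi.ImageWidth)
ADDPROPERTY(loImage, "Height", loGdi.ImageHeight)
RETURN loImage
ENDFUNC
```

With such a helper, the graphic artist can resize Test.jpg at will; the WIDTH and HEIGHT attributes generated in the HTML always reflect the file on disk.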
A robot application works for you
As mentioned earlier in this article, I always have a robot application alongside all my Web applications. This application is responsible for executing specific tasks, at specific intervals, on content specific to your Web application. It handles requests that I wish to offload from the main server, as well as requests that may take a certain time to execute. As I aim to have the Web application respond to each user's request in less than one second, I don't want a specific request to take 10 seconds to execute. That is one good reason to offload such a task to another application.
The first use that comes to mind is email delivery. Most of the Web applications I have done generate a lot of emails that need to be sent on a daily basis. That includes confirmations of account creation, renewal notices, invoices, receipts, personal content related to a user, etc. All of those are initiated by a specific request the user makes on your Web application. However, this doesn't mean that the same process should be responsible for sending the emails. I always train developers to only initiate the request from the Web application and rely on the robot application to actually send it.
The concept of sending emails includes a variety of sub-topics such as archiving everything, logging, and applying a status where applicable. I always include an email table in all my applications. That table contains the necessary fields for sending an email. When it comes time to send one, I only invoke the AddEmail() function, which takes the following parameters:
* Add an email
* expC1 Address of the sender
* expC2 Message
* expC3 Subject
* expC4 Name of the sender
* expC5 Email
* expC6 Name
* expC7 CC email
* expC8 CC name
* expC9 Files
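AddEmail() itself does not send anything; it only inserts a record in the email table, which the robot application picks up later. A minimal sketch (the table and field names are illustrative):

```foxpro
* Add an email to the queue; the robot application sends it later
FUNCTION AddEmail
LPARAMETERS tcSender, tcMessage, tcSubject, tcSenderName, ;
   tcEmail, tcName, tcCCEmail, tcCCName, tcFiles

* Queue the message; Added and Sent support logging and status tracking
INSERT INTO Email (Sender, Message, Subject, SenderName, Email, ;
   cName, CCEmail, CCName, Files, Added, Sent) ;
   VALUES (tcSender, tcMessage, tcSubject, tcSenderName, tcEmail, ;
   tcName, tcCCEmail, tcCCName, tcFiles, DATETIME(), .F.)
RETURN .T.
ENDFUNC
```

Because every email goes through the table, you get the archive for free, and a failed delivery can simply be retried on the robot's next pass by leaving Sent set to false.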
I usually have several levels of processing in such an application. With the timer set to 60 seconds, it executes at each interval the tasks that fit within that timeframe. Other tasks that don't require such a fast pace may be defined to execute every 10 minutes. I always keep a daily set of tasks as well, executed at midnight. This provides a lot of flexibility to such an application and some great advantages to your Web applications.
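Those levels can all be driven by a single timer that counts its own ticks. A sketch of the dispatch logic, assuming a 60-second timer; the task procedure names and the nTicks and dLastDaily custom properties are illustrative:

```foxpro
* Timer event of the robot's timer (Interval = 60000, i.e. 60 seconds)
* nTicks and dLastDaily are custom properties added to the timer class
THIS.nTicks = THIS.nTicks + 1

DO SendPendingEmails             && executed every minute

IF MOD(THIS.nTicks, 10) = 0
   DO GenerateStaticPages        && executed every 10 minutes
ENDIF

IF DATE() > THIS.dLastDaily
   DO DailyTasks                 && executed once, on the first tick of a new day
   THIS.dLastDaily = DATE()
ENDIF
```

Adding a new level is then only a matter of another MOD() test, without touching the existing tasks.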
The first process is always the longest
In all applications, there is always a set of properties, objects and such that have to be loaded once. The same applies to a Web application. This initialization usually takes a few extra seconds to execute, as opposed to any regular request your application responds to. As you don't want to execute it on every request, you need a way to have it executed only once. Several techniques can achieve that; I prefer to rely on a public variable. At first, it doesn't exist. When that is the case, the application executes a specific function or method that handles all the initialization. Then you create that variable, which makes it visible to all other processes, so the initialization code won't be executed again.
I use West Wind Web Connection for my Web applications. Once a request goes into the Process() method, I have something like this:
* On the first transaction only
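A minimal sketch of that first-transaction check (glStarted and InitializeApplication are illustrative names):

```foxpro
IF TYPE("glStarted") <> "L"      && the public variable doesn't exist yet
   DO InitializeApplication      && open the tables, load settings, etc.
   PUBLIC glStarted
   glStarted = .T.               && now visible to all subsequent requests
ENDIF
```

The TYPE() test is the key: on every request after the first, the variable exists, the test fails, and the expensive initialization is skipped entirely.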
Considering that the applications I have done keep approximately between 40 and 100 tables permanently open, this is another great advantage of that approach. On the first request, the initialization kicks in, the tables are opened, and a few extra seconds are spent doing all that. For all subsequent requests, however, it is fast.
Search to be used with care
Among the most CPU-intensive requests your application will handle, searches certainly deserve care. A great feature offered on many Web sites, a search engine allows your users to extract pages and content from your site. However, this feature can be hard on your server's CPU when big searches are executed. Even though a search engine will return the result set in a few seconds most of the time, some searches can be quite demanding on the CPU. Implementing that on your main server is always a bad idea. When you can afford it, try to offload that task to a secondary server.
But how exactly should this be accomplished? You want your Web site to rely on a common entry point, so you don't want to switch to another Web site when it comes time to search. Several techniques exist for that. I prefer to keep the same entry point but, once the search kicks in, execute it on a secondary server. The process is quite simple. Once you have collected the search parameters, your Web application uses some kind of mechanism to obtain the result set from the secondary server. This can be accomplished by using XML to communicate between the two servers: the secondary server receives an XML string containing the search parameters, executes the search, and returns the result set as an XML string.
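One simple way to carry that XML exchange is an HTTP POST from the main application to the secondary server. A sketch using the MSXML2.XMLHTTP COM object (the server URL and the XML layout are illustrative):

```foxpro
* Delegate the search to the secondary server
LOCAL loHttp, lcKeyword, lcRequest, lcResult
lcKeyword = "books"              && as collected from the search form

* Search parameters packaged as XML
lcRequest = '<search><keyword>' + lcKeyword + ;
   '</keyword><maxrows>50</maxrows></search>'

* Synchronous POST to the secondary server, which runs the actual query
loHttp = CREATEOBJECT("MSXML2.XMLHTTP")
loHttp.Open("POST", "http://search.mysite.com/Search.exe", .F.)
loHttp.Send(lcRequest)
lcResult = loHttp.ResponseText   && the result set, returned as XML
```

The main application then only has to merge lcResult into the HTML it returns; the heavy query never touches the main server's CPU.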
This technique lets your application handle the in and out of the HTML code without processing the biggest part, which becomes the responsibility of another server.
I hope you have been able to collect a few tips here and there. I have a lot more to share, so if this article receives some nice feedback, I'll be happy to deliver a follow-up.