>>We are not running across a network, just from the local hard drive, but because we have been gradually scaling up, that could be a possibility in the near future.
>
>Unfortunately, it seems that my comments sometimes
don't apply on the local drive, only over the network. Full Optimization, on a local drive and with relatively few records, may actually be worthwhile. I remember testing some years ago; queries were actually
faster with the index on deleted() (which I now avoid). The situation may change quite a bit, though, with hundreds of thousands of records (instead of, say, one thousand), or over a network.
>
>>As you have stated, though, the data is changing all the time. Is the penalty I am getting now (about 0.1 to 0.2 seconds) worth it, constantly monitoring the data and basically doing what Oracle or SQL Server would do, a kind of crude statistics? I don't know.
>
>The database server has to maintain statistics. On the other hand, you, as a programmer, can do some intelligent guessing in advance: what data do you plan to place in a field? Of course, gathering some statistics every now and then, to check your initial guess, may be useful. The following query counts how many times each value of KeyField is used in SampleTable:
>
>
>select KeyField, count(*) as TimesUsed;
> from SampleTable;
> group by KeyField;
> order by TimesUsed descending
>
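>The same idea can be sketched outside of VFP as well. Here is a minimal Python illustration (with made-up sample values standing in for SampleTable.KeyField) of counting how often each value occurs and deriving a rough selectivity figure, i.e. distinct values divided by total rows; the data and the 0.x thresholds are assumptions for illustration only:

```python
from collections import Counter

# Hypothetical values standing in for SampleTable.KeyField.
key_field_values = ["A", "B", "A", "C", "A", "B", "A", "A", "B", "C"]

# Count how many times each value is used (same idea as the query above).
times_used = Counter(key_field_values)

# Rough selectivity: distinct values / total rows. A figure near 1.0
# suggests an index on the field narrows searches well; a figure near
# 0 suggests the index adds overhead without much benefit.
selectivity = len(times_used) / len(key_field_values)

# List values from most to least frequent, like ORDER BY TimesUsed DESCENDING.
for value, count in times_used.most_common():
    print(value, count)
print(f"selectivity = {selectivity:.2f}")
```

>Running such a check every now and then, as the data grows, gives a cheap stand-in for the statistics a cost-based optimizer would keep automatically.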
>
>>As to your comment about Oracle, I'm not sure why you would ever use rule-based optimization unless the overhead for keeping statistics outweighs the benefits.
>
>Yes, I understand that cost-based optimization is the preferred method. But I don't actually use Oracle; I have only done some reading. Therefore, I wouldn't know for sure.
>
>Hilmar.
Thanks for the help. You actually opened my eyes to some new possibilities as far as optimization goes.