David, would you care to hazard a guess at a number at which using #DEFINE might start making a difference on modern CPUs? IOW, at what point might a programmer start worrying about this?
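FWIW, here's roughly how I'd try to put a number on it myself: nothing scientific, just a SECONDS() loop, and the constant and memvar names below are made up for the example.

* One million uses each way; the #DEFINE value is substituted at
* compile time, the memvar is looked up by name at runtime.
#DEFINE C_TAXRATE 0.21
LOCAL lnTaxRate, lnDummy, lnStart, i
lnTaxRate = 0.21

lnStart = SECONDS()
FOR i = 1 TO 1000000
    lnDummy = 100 * C_TAXRATE
ENDFOR
? "#DEFINE:", SECONDS() - lnStart

lnStart = SECONDS()
FOR i = 1 TO 1000000
    lnDummy = 100 * lnTaxRate
ENDFOR
? "memvar :", SECONDS() - lnStart
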
>Walter,
>
>>1. Actually it might bite you as well. E.g:
>>
>>#DEFINE MYSTRING "This is a very long string of a few hundred characters long ....."
>>If you use this instead of a variable, the string ends up in the compiled code as many times as you use the constant, consuming more memory than when using variables.
>
>That consumes more memory; it doesn't consume memvars. And yes, I'd call the above #define abuse. *g*
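
For anyone skimming the thread, a bare-bones sketch of what point 1 is comparing (the string here is just a short stand-in for the few-hundred-character one):

* Preprocessor constant: the literal is substituted at compile time,
* so every place it is used carries its own copy of the string.
#DEFINE MYSTRING "pretend this is a few hundred characters ....."
? MYSTRING
? MYSTRING          && second copy of the literal in the compiled code

* Variable: the string is stored once and referenced by name at runtime.
LOCAL lcMyString
lcMyString = "pretend this is a few hundred characters ....."
? lcMyString
? lcMyString        && same single copy each time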
>
>>2. As for runtime execution speed: I don't think anyone would take that seriously for only a very few performance-critical routines. You can of course prove the difference, but how much value does this have in practice
>
>Whether or not it's a "measurable" amount of time depends on how many constant memvars are being set up, e.g. if the couple of thousand Word constant memvars are all being created. And yes, correspondingly huge .h files do take longer to compile, but the end user doesn't experience that time, just us poor slob programmers.
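
To spell out the two approaches behind that (the wdXXX values are the usual Word automation constants; the header name is made up):

* Startup-time approach: create a couple of thousand constant memvars,
* paying the assignment cost every time the app starts.
PUBLIC wdAlignParagraphLeft, wdAlignParagraphCenter    && ...plus ~2000 more
wdAlignParagraphLeft   = 0
wdAlignParagraphCenter = 1

* Compile-time approach: pull the same values in as #DEFINEs.
* Nothing runs at startup, but the big header slows every compile.
#INCLUDE wdconst.h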
>
>>, while just about every VFP programmer has difficulty getting their database performance over the network under control.
>
>Certainly there are other, more time-consuming things that go on in our apps.
In the End, we will remember not the words of our enemies, but the silence of our friends - Martin Luther King, Jr.