Charlie,
You should just use the solution offered earlier: save SECONDS() to a custom property when timing starts, then in the Timer event take the difference between the current SECONDS() value and the saved value to get the total elapsed time. Because it measures wall-clock time rather than counting timer ticks, this runs the same on machines of any speed.
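A minimal sketch of that approach in VFP; the class and property names here are illustrative, not from the original thread:

```
* Hedged sketch: subclass Timer, store the start time in a
* custom property, and compute elapsed time as a delta.
DEFINE CLASS ElapsedTimer AS Timer
    Interval = 100      && fire roughly every 100 ms; accuracy does
                        && not depend on this value or on CPU speed
    nStart = 0          && custom property holding the start time

    PROCEDURE Init
        This.nStart = SECONDS()   && seconds since midnight
    ENDPROC

    PROCEDURE Timer
        LOCAL nElapsed
        * Delta against the saved wall-clock value
        nElapsed = SECONDS() - This.nStart
        WAIT WINDOW TRANSFORM(nElapsed) + " seconds elapsed" NOWAIT
    ENDPROC
ENDDEFINE
```

One caveat: SECONDS() resets at midnight, so a run that spans midnight needs an extra check (or use DATETIME() deltas for whole-second resolution over long spans).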
>Thanks for getting back to me. I tried the C++ timer that was suggested, but it was only capable of 20 ms resolution, while the Fox timer was capable of better than 10 ms resolution. This is because the C++ timer sits outside the VFP runtime in an FLL. I was thinking of possibly setting up some sort of external calibration routine to adjust the timer interval for different CPUs.