Charlie,
>10ms is 1/100th of a second. If I set the timer to 10ms and increment a counter by .01 seconds in each timer event, then 100 ticks make a second. This seems to get me real close. I can then adjust the Interval property to calibrate, but I'm not sure how it would be affected by changing CPU load.
Your timer would need to be set to 5ms in order to reliably detect a 10ms event. Also note that counting ticks assumes each tick fires exactly on schedule; under CPU load the OS can deliver timer events late, so the counter will drift behind real time. Reading an actual clock instead of counting ticks avoids that.
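Here's a toy Python sketch (not from the original thread, and `time.sleep` stands in for whatever timer component you're using) showing why accumulating 0.01s per tick drifts: each sleep can only run long, never short, so the counter falls behind a monotonic clock.

```python
import time

TICK = 0.010  # nominal 10 ms timer interval

def run_tick_counter(duration_s=0.2):
    """Emulate a 10 ms timer callback that adds 0.01 s per tick.
    Returns (counted, wall): the accumulated tick counter vs the
    elapsed time a monotonic clock actually measured."""
    counted = 0.0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        time.sleep(TICK)   # the OS may wake us late, never early
        counted += 0.01    # the counter assumes exactly 10 ms passed
    wall = time.monotonic() - start
    return counted, wall
```

On a loaded machine the gap between `counted` and `wall` grows with every late tick, which is why calibrating the Interval property can't fully fix it: the error isn't constant.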
>Sampling? Is this the same as sampling an analog waveform in digital recording where highest frequency sampled is half sampling frequency?
Yes, same idea. The theoretical minimum (Nyquist) is sampling at twice the highest frequency in the signal, but in practice, to get accurate results, sampling needs to occur about 10 times as fast as the frequency of the signal being measured.
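The polling version of the same rule can be shown with a toy Python sketch (names and numbers are illustrative, not from the thread): a poll interval of half the event width always lands a sample inside the event, while a poll interval wider than the event can straddle it and miss it entirely.

```python
def poll_detects(pulse_start_ms, pulse_width_ms, poll_interval_ms,
                 horizon_ms=1000):
    """Return True if polling every poll_interval_ms lands at least one
    sample inside the pulse [pulse_start, pulse_start + pulse_width)."""
    for t in range(0, horizon_ms, poll_interval_ms):
        if pulse_start_ms <= t < pulse_start_ms + pulse_width_ms:
            return True
    return False

# A 5 ms poll always catches a 10 ms pulse, whatever its phase;
# a 20 ms poll can fall on either side of it and miss it.
```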