10 ms is 1/100th of a second. If I set the timer to 10 ms and, in the timer event, increment a counter by .01 seconds on each call, then 100 ticks make one second. This seems to get me real close. I can then adjust the Interval property to calibrate, but I'm not sure how that would be affected by changing CPU load.
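The arithmetic checks out (100 ticks x 0.01 s = 1 s), but most timers fire no sooner than their interval, so the counted time drifts behind the wall clock, and more so under load. A minimal Python sketch (hypothetical, not the code under discussion) that measures that drift:

import time

# Hypothetical sketch: count 10 ms ticks and compare against a
# monotonic clock. sleep() stands in for the timer event, which
# fires no sooner than (and usually later than) the nominal interval.
TICK_S = 0.01              # nominal 10 ms interval
counted = 0.0              # time as measured by counting ticks
start = time.monotonic()
for _ in range(500):       # roughly five seconds' worth of ticks
    time.sleep(TICK_S)     # placeholder for the timer callback firing
    counted += TICK_S      # add .01 s per tick, as described above
elapsed = time.monotonic() - start
print(f"counted {counted:.2f} s vs actual {elapsed:.2f} s "
      f"(drift {elapsed - counted:.3f} s)")

Reading elapsed time from a monotonic clock inside each tick, rather than accumulating a fixed increment, sidesteps the calibration problem entirely, since the count then can't drift no matter how late the ticks arrive.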
Sampling? Is this the same as sampling an analog waveform in digital recording, where the highest frequency that can be captured is half the sampling frequency?
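(That is the Nyquist criterion from sampling theory: a sampling rate fs captures frequencies only below fs/2, and anything above that folds back as an alias. A small Python illustration with hypothetical numbers: a 60 Hz tone sampled at 100 Hz produces the same sample values as a 40 Hz tone, up to sign.)

import math

fs = 100.0   # sampling rate, Hz (analogous to a 10 ms timer tick)
f = 60.0     # signal frequency above the Nyquist limit of fs/2 = 50 Hz

samples = [math.sin(2 * math.pi * f * n / fs) for n in range(10)]
alias   = [math.sin(2 * math.pi * (fs - f) * n / fs) for n in range(10)]

# sin(2*pi*60*n/100) = sin(2*pi*n - 2*pi*40*n/100) = -sin(2*pi*40*n/100),
# so the 60 Hz samples equal the negated 40 Hz samples: both columns match.
for s, a in zip(samples, alias):
    print(f"{s:+.4f}  {-a:+.4f}")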
Charlie