10 ms is 1/100th of a second. If I set the timer to 10 ms, and in the timer event handler increment a counter by 0.01 seconds on each call, then 100 ticks will equal one second. This seems to get me really close. I can then adjust the Interval property to calibrate, but I'm not sure how this would be affected by changing CPU load.
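A quick sketch of the concern about CPU load, under the assumption that the timer fires no sooner than its interval but possibly later (which is how typical message-loop timers behave). The names and figures here are hypothetical; the point is that the counter only knows how many ticks it has seen, not how late each tick was:

```python
import time

TICK_SECONDS = 0.01  # the assumed 10 ms interval

def run_ticks(n_ticks, actual_interval):
    """Simulate n_ticks timer events that actually fire every
    actual_interval seconds; return (counted_time, elapsed_time)."""
    counted = 0.0
    start = time.perf_counter()
    for _ in range(n_ticks):
        time.sleep(actual_interval)   # stand-in for the timer firing
        counted += TICK_SECONDS       # what the tick counter believes
    elapsed = time.perf_counter() - start
    return counted, elapsed

# Suppose load delays each tick to 15 ms instead of 10 ms:
counted, elapsed = run_ticks(20, 0.015)
# counted stays at 20 * 0.01 = 0.2 s, but the real elapsed time is
# at least 0.3 s, so the counted clock falls behind under load.
```

This is why calibrating the interval only works for one particular load level; measuring elapsed time against a real clock at each tick avoids the drift entirely.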
Sampling? Is this the same as sampling an analog waveform in digital recording, where the highest frequency that can be captured is half the sampling frequency?
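That half-the-sample-rate limit (the Nyquist frequency) can be checked numerically: a tone above it produces exactly the same sample values as a lower-frequency alias. A small sketch with hypothetical figures:

```python
import math

FS = 100.0  # sampling rate in Hz (hypothetical)

def sample(freq, n_samples):
    """Sample a cosine of the given frequency at FS Hz."""
    return [math.cos(2 * math.pi * freq * n / FS) for n in range(n_samples)]

# A 90 Hz cosine sampled at 100 Hz yields the same values as a
# 10 Hz cosine: 90 Hz aliases to 100 - 90 = 10 Hz, so the two
# signals are indistinguishable from their samples alone.
hi = sample(90.0, 50)
lo = sample(10.0, 50)
all_match = all(abs(a - b) < 1e-9 for a, b in zip(hi, lo))
```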
Charlie