Reduce to zero
Message
From: 06/02/2009 11:27:00
To: Dragan Nedeljkovich (Zrenjanin, Serbia), 05/02/2009 23:16:49
General information
Forum: Visual FoxPro
Category: Coding, syntax & commands
Title: Reduce to zero
Environment versions
Visual FoxPro: VFP 9 SP2
Miscellaneous
Thread ID: 01379529
Message ID: 01379820
Views: 20
Thanks All.

Just as an FYI: for my purposes, no individual data point is particularly important per se. The main objective is to smooth out spikes while still letting the data fall to zero when it is genuinely meant to be zero. I have opted to reduce the previous value by 20% each step. This reduces to zero in a finite number of steps, between 10 and 50 depending on the initial value, which is acceptable.
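The 20% step-down described above can be sketched as follows (in Python rather than VFP, purely as an illustration; the function name and the `floor` cutoff, below which the value is treated as zero, are my assumptions, not part of the original post):

```python
def steps_to_zero(start, factor=0.8, floor=1.0):
    """Count how many 20% reductions it takes for `start` to fall
    below `floor`, at which point the value is treated as zero."""
    value, steps = start, 0
    while value >= floor:
        value *= factor   # reduce the previous value by 20%
        steps += 1
    return steps
```

With a cutoff of 1.0, a starting value of 10 takes 11 steps and a starting value of 100 takes 21, consistent with the "between 10 and 50 steps" range quoted above for typical magnitudes.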

Thanks.


>>Jos,
>>
>>Knowing the actual root problem helps greatly! If I were approaching this "noisy data" problem, I would not alter the raw data, because altering it destroys auditability. What I'd do is store all of the raw data, gather evidence of the errors, and use that to get the 3rd party to provide better data.
>>
>>At display time, I think you can choose to show "filtered results": for example, ignore data points that deviate by more than N sigma from the average over a sliding window of days. This would smooth the curve, but it would also time-skew it, so the plots would not show large changes in value until a day or two later. This may or may not be acceptable; you have to make sure everyone understands the effects of the display filters going into it.
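The N-sigma sliding-window display filter described in the quoted reply could be sketched like this (Python used purely as an illustration; the function name, window length, and the choice to substitute the window mean for a rejected point are my assumptions):

```python
from statistics import mean, stdev

def filter_spikes(series, window=7, n_sigma=3.0):
    """Display-time filter: replace any point more than n_sigma
    standard deviations from the trailing-window mean with that
    mean. The raw series itself is never modified."""
    out = []
    for i, x in enumerate(series):
        past = series[max(0, i - window):i]   # trailing window only
        if len(past) >= 2:
            mu, sd = mean(past), stdev(past)
            # If the window is perfectly flat (sd == 0), keep the point.
            if sd > 0 and abs(x - mu) > n_sigma * sd:
                out.append(mu)                # smooth the spike away
                continue
        out.append(x)
    return out
```

Because only past points feed the window, the filter exhibits exactly the time skew mentioned above: a genuine step change is suppressed until enough later points confirm it.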
>
>Also, I don't see any need to do it in ten steps. Just take some honest approximation between the last value and zero, say half of last time's sigma. The task description says only that we may need to come closer to zero if the zero repeats. Well, if it does, the last value is already closer, and it has introduced some scatter (by being half a sigma below the previous value), so sigma may even increase and the value will approach zero faster still. In other words, if the zero is a bona fide value, it will be reached fast enough; if it's a glitch, the error will be around sigma/2. And there's no limit on the number of repetitions needed to recognize a zero as a true value: it may become close enough fast enough.
>
>Running this algorithm (and varying between my proposed half sigma and other possible values) against old data may show what's most fitting for the purpose.
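The half-sigma approach sketched in the quoted reply might look like this (Python as an illustration; the function name and the fallback to a 20% cut when the history is perfectly flat are my assumptions, the latter borrowed from the 20% reduction chosen earlier in the thread):

```python
from statistics import pstdev

def next_display_value(history, reported):
    """When the feed suddenly reports zero, step toward zero by half
    the recent scatter (sigma) instead of dropping straight down.
    Non-zero reports pass through unchanged."""
    last = history[-1]
    if reported != 0 or last == 0:
        return reported
    sigma = pstdev(history) or last * 0.2   # flat history: fall back to a 20% cut
    return max(0.0, last - sigma / 2)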
In the end, we will remember not the words of our enemies, but the silence of our friends - Martin Luther King, Jr.