Statistics on the fly

I have 200+ channels coming in continuously, at a frame rate of 10 Hz.

 

The client requires an AVERAGE value for each channel between a START time and a STOP time, possibly several minutes apart.

 

What I do is set a COUNT to 0 and clear a 200-channel SUM[ ] buffer at the START time.

 

For each sample, if averaging is in progress, I add the current sample[ ] to the SUM[ ] buffer and increment the COUNT.

 

Sometime after the STOP time, the average is required.

 

So I take the SUM[ ] and divide by the COUNT (call it N), and that's the average for all 200 channels.
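
(Here's that running-mean scheme as a Python sketch rather than a LabVIEW diagram, just to pin down the logic; NUM_CHANNELS, sum_buf, and count are placeholder names of my own.)

import numpy as np

NUM_CHANNELS = 200          # one accumulator slot per channel

# At START time: clear the accumulators.
sum_buf = np.zeros(NUM_CHANNELS)
count = 0

def on_frame(sample):       # sample: 200 values, arriving at 10 Hz
    """Accumulate one frame while averaging is in progress."""
    global count
    sum_buf += sample       # element-wise add into the per-channel sums
    count += 1

def get_average():          # called sometime after STOP time
    return sum_buf / count  # per-channel mean over the START..STOP window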

 

The only memory required is for 200 channels, regardless of the duration. I don't need to keep every sample around.

 

That works just fine.

 

Now the client wants to add MIN, MAX, StdDev, and Variance to the list of stats needed.

 

MIN and MAX are easy: I just compare each sample[ ] to the existing MIN[ ] and MAX[ ] arrays and keep the smaller and larger.
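
(Extending the same sketch: initialize to plus/minus infinity so the first frame always wins, then take the element-wise min/max in place. min_buf and max_buf are placeholder names.)

# At START time, alongside the sum/count reset:
min_buf = np.full(NUM_CHANNELS, np.inf)     # any real sample is smaller
max_buf = np.full(NUM_CHANNELS, -np.inf)    # any real sample is larger

def on_frame_minmax(sample):
    np.minimum(min_buf, sample, out=min_buf)  # keep the smaller, per channel
    np.maximum(max_buf, sample, out=max_buf)  # keep the larger, per channel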

 

But the definition of variance is SUM((Xi - Mean)^2) / N. (StdDev is the square root of that.)

 

Doesn't that mean I have to have every single sample in hand when it's time to compute it?

 

I can't compute Xi - Mean until I know what the Mean is, and I don't know that until the end.

 

Any way to avoid storing every single sample?
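
(For what it's worth, the usual way around this is algebraic: expanding the square gives SUM((Xi - Mean)^2) = SUM(Xi^2) - 2*Mean*SUM(Xi) + N*Mean^2 = SUM(Xi^2) - N*Mean^2, since SUM(Xi) = N*Mean. So one extra 200-channel SUMSQ[ ] buffer is all the memory it takes. A sketch, continuing the placeholder names above:)

sumsq_buf = np.zeros(NUM_CHANNELS)   # cleared at START, like sum_buf

def on_frame_stats(sample):
    global count
    sum_buf += sample
    sumsq_buf += sample * sample     # accumulate Xi^2 alongside Xi
    count += 1

def get_variance():                  # called sometime after STOP time
    mean = sum_buf / count
    # SUM((Xi - mean)^2) / N  ==  SUM(Xi^2)/N - mean^2
    return sumsq_buf / count - mean * mean

def get_stddev():
    return np.sqrt(get_variance())

(One caveat: when the mean is large relative to the spread, SUMSQ/N - Mean^2 can lose precision to cancellation; Welford's online update is the numerically stable alternative if that matters here.)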

 

