import math

var data = @[3.0, 4.0, 5.0, 6.0, 7.0]

proc populationVariance(data: seq[float]): float =
  let n = len(data)
  let m = (n - 1) / n
  variance(data) / m

echo "Data : ", data
echo "Variance (Nim) : ", variance(data)
echo "Variance (R) : ", populationVariance(data)
echo "Mean : ", mean(data)
echo "Std. Dev. (R) : ", sqrt(populationVariance(data))
echo "Std. Dev. (Nim) : ", standardDeviation(data)
The last line fails with this error:
Error: type mismatch: got (seq[float]) but expected one of: math.standardDeviation(s: RunningStat)
I am on Linux with the latest 0.10.3.
Um... ok? What exactly is wrong?
If you look at the math module's documentation, you'll notice there is no standardDeviation procedure that takes a sequence of floats, only one that takes a RunningStat (which is something of an oversight). Please open a pull request or issue if you want a standard deviation that takes a sequence.
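In the meantime, you can feed the sequence into a RunningStat yourself. A minimal sketch, assuming the push and standardDeviation(s: RunningStat) procs from this version of the math module (the wrapper name stdDevSeq is just an illustration, not an existing API):

```nim
import math

# Hypothetical wrapper: accumulate each element into a RunningStat,
# then call the existing standardDeviation(s: RunningStat) overload.
proc stdDevSeq(data: seq[float]): float =
  var s: RunningStat
  for x in data:
    push(s, x)           # update the running statistics with this value
  standardDeviation(s)

echo stdDevSeq(@[3.0, 4.0, 5.0, 6.0, 7.0])
```

Note that RunningStat's standard deviation may use a different denominator (n vs. n - 1) than R's sd, so compare against your populationVariance-based value before relying on it.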