I thought I'd bang out a quick Rosetta Code task before lunch: generate a collection of normally-distributed random numbers.
MiniScript doesn't have such a function built in, but then most languages don't, and everybody uses the Box-Muller method, which involves drawing two uniform random numbers in the 0-to-1 range and then passing them through a formula.
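For reference, the transform as I understand it is (writing ln for the natural log):

    z = sqrt(-2 * ln(u)) * cos(2 * pi * v)

which should produce a standard normal value z that you then multiply by the desired standard deviation and add to the desired mean.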
But when I do this and check the mean and standard deviation, the mean comes out fine (close to 0), while the standard deviation (which should be 1) comes out close to 1/3 instead. I have no idea what's going on! Can anyone solve this mystery?
Here is a snippet that shows my randNormal function; a sketch of the code I use to calculate the mean and standard deviation of a list of numbers follows it. I checked those helper functions on a couple of problems small enough to calculate by hand, and they come up with the right answers there. But on my sample of 1000 randNormal calls, I get a standard deviation close to 0.33 instead of something close to 1.
randNormal = function(mean=0, stddev=1)
    u = rnd
    v = rnd
    // Box-Muller method; passing base e to log gives the natural log
    // (MiniScript's log defaults to base 10):
    return mean + u * sqrt(-2 * log(u, 2.7182818284)) * cos(2*pi*v) * stddev
end function
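My checking code amounts to the textbook formulas; here's a minimal sketch along those lines (not my exact code, and the names mean, stddev, and samples are just illustrative):

mean = function(nums)
    total = 0
    for x in nums
        total = total + x
    end for
    return total / nums.len
end function

stddev = function(nums)
    // population standard deviation: square root of the average squared deviation
    m = mean(nums)
    sumSq = 0
    for x in nums
        sumSq = sumSq + (x - m)^2
    end for
    return sqrt(sumSq / nums.len)
end function

samples = []
for i in range(1, 1000)
    samples.push randNormal  // defaults mean=0, stddev=1 are used
end for
print "mean: " + mean(samples)
print "stddev: " + stddev(samples)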
It's as if the Box-Muller method does not work... but everybody seems to think it does. Help!