A general-purpose random number generator (RNG). Currently only the ziggurat RNG of Marsaglia and Tsang (2000) is offered, but later editions will offer additional generators. This RNG is expected to be an adequate source of random numbers for any process that does not explicitly concern itself with random number generation and does not require specialist random numbers. See the bottom of this page for a discussion of the algorithm.
The utility object provides the following C++ interface (for details of the C interface, see the associated header file).
C/C++ API (Utility)
Most of this interface is undocumented, simply because no one has written the documentation yet, but hopefully it is more-or-less self-explanatory. Some functions whose operation is not obvious are detailed below. For usage examples, see the source code for any Standard Library process that uses this utility (this includes
For a usage example, see the source code for
RNG_MT2000 was originally written by George Marsaglia and Wai Wan Tsang in 2000 (published in Journal of Statistical Software, 5(8)).
Issues with Marsaglia & Tsang (2000)
Some commentary below is originally from Rob Stewart, though I (mitch) may have edited it; I can't remember.
According to Leong et al. (2005), the MT2000 RNG fails the chi-square test for normality. They proposed a fix that adds five lines of C code (Leong et al., 2005). The failure comes down to the relatively short period of the uniform RNG used in MT2000, coupled with the fact that they generated 20 billion random numbers.
As I understand it, the only problem with the SHR3 inline uniform generator in the 2000 code is its relatively short period (2^32 - 1, to be precise). Since this is the foundation of the ziggurat code, the Gaussian RNG inherits the same short period. What Leong and others did in the 2005 paper was to generate 20 billion random numbers (roughly 5 times the period of the RNG), and this was enough to break Gaussianity by the chi-square measure. This is not a very interesting result, but it does highlight that if you want a longer period, you need a better uniform generator. What is interesting is how close the polar method and the 2005 improved method come to the 95% confidence limit in their tests. Neither method is anywhere near 99% confidence, and I would be interested to see how an MT-ziggurat generator would fare on this test. Anyone want to give it a go?
I really don't see this as a big issue in simulation, and thus continue to use the original algorithm from Marsaglia & Tsang (2000). Ultimately, you need to ask yourself whether repeats in the RNG every few billion calls matter in your simulation.

For example, in many of my simulations, I use RNOR to add Gaussian i.i.d. noise to each model neuron's membrane potential at each simulation time step. (I also use RNOR to generate log-normal numbers for the inter-spike intervals of extrinsic neurons, but we'll ignore that for the moment.) Say I have a network of 1000 neurons and a time step of 0.2 ms: that gives 5000 RNOR calls every simulated millisecond, or 5 million every simulated second, so RNOR will repeat after about 859 seconds of simulated time for this network model. Do your simulations run this long? Mine don't. Of course, with 100,000 model neurons you start to get repeats every few seconds, but are your models that large? These are important reality checks.

Also, even if your RNG does repeat, will this actually introduce any periodicities into your model's behaviour? Not likely. The state of the model will be different when the repeat occurs, and the repeated values will be applied to different model components. You might also want to look here:
If you were really serious about using the best uniform generator for this, I would look into implementing the Mersenne Twister (MT19937) to generate the uniforms:
Interestingly, for version 7.3 (I think), The MathWorks incorporated the Mersenne Twister as an option in rand, as explained here:
This is a GOOD THING, and everyone using that version of MATLAB should use this RNG in their MATLAB code.
Now, the short period is perhaps not strictly the _only_ problem with the ziggurat code. Doornik (2005) showed there is another problem with the method when it is modified to produce doubles rather than floats, but I'm not sure how relevant his result is for float generation.
It looks like someone (Jochen Voss) has already put MT19937 together with the ziggurat method. Nice! (jazz man voice). It is available here:
I haven't really looked at it yet, but it's GSL-centric. Anyone got time to give it a try? Of course, for any of you who didn't know already, the ziggurat method has been the one used for MATLAB's randn since MATLAB 5:
So The MathWorks trusts the method, but uses a version with a longer period...
Leong, P. H. W., Zhang, G., Lee, D.-U., Luk, W., & Villasenor, J. D. (2005). A comment on the implementation of the ziggurat method. Journal of Statistical Software, 12(7). URL: http://www.jstatsoft.org/v12/i07/
Marsaglia, G. & Tsang, W. W. (2000). The ziggurat method for generating random variables. Journal of Statistical Software, 5(8). URL: http://www.jstatsoft.org/v05/i08/
Doornik, J. A. (2005). An improved ziggurat method to generate normal random samples. Working paper, Nuffield College, Oxford. URL: http://www.doornik.com/research.html
|This is a documentation page for the BRAHMS Modular Execution (Simulation) Framework. For details of licensing and distribution, please click here. To advise of an error or omission, please click here.|