Generating Random Variables
Christian Robert^{1} and George Casella^{2}

Keywords and Phrases: Random Number Generator, Probability Integral Transform, Accept-Reject Algorithm, Metropolis-Hastings Algorithm, Gibbs Sampling, Markov Chains
The methods described in this article mostly rely on the possibility of producing (with a computer) a supposedly endless flow of random variables (usually iid) for well-known distributions. Such a simulation is, in turn, based on the production of uniform random variables. There are many ways in which uniform pseudo-random numbers can be generated. For example, there is the Kiss algorithm of Marsaglia and Zaman (1993); details on other random number generators can be found in the books of Rubinstein (1981), Ripley (1987), Fishman (1996), and Knuth (1998).
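As a minimal illustration of how such a uniform generator can be built (a classical linear congruential recursion, not the Kiss algorithm itself; the class and parameter names are ours):

```python
# Minimal linear congruential generator (LCG) for Uniform(0,1) draws.
# Illustrative only: modern generators such as Kiss combine several
# recursions of this kind to obtain far better statistical quality.
class LCG:
    def __init__(self, seed=12345):
        self.m = 2**31 - 1          # modulus (a Mersenne prime)
        self.a = 16807              # multiplier (Lewis-Goodman-Miller choice)
        self.state = seed % self.m

    def uniform(self):
        # x_{n+1} = a * x_n mod m, scaled to (0, 1)
        self.state = (self.a * self.state) % self.m
        return self.state / self.m

rng = LCG(seed=42)
u = [rng.uniform() for _ in range(5)]   # five pseudo-random uniforms
```

The same seed always reproduces the same stream, which is what makes simulation experiments repeatable.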

Mixture Distributions
In a mixture representation, a density f has the form
f(x) = ∫ g(x|y) p(y) dy (continuous)
or
f(x) = Σ_i p_i f_i(x) (discrete).
We then can simulate X ~ f as
 1. Y ~ p, then X ~ g(x|Y), or
 2. I = i with probability p_i, then X ~ f_i.
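As an illustration of the discrete case (Python; the two-component normal mixture is our example, not from the text):

```python
import random

def sample_mixture(weights, samplers, rng):
    # Discrete mixture: pick component i with probability p_i, then X ~ f_i.
    i = rng.choices(range(len(weights)), weights=weights)[0]
    return samplers[i](rng)

rng = random.Random(0)
# Two-component normal mixture: 0.3*N(-2, 1) + 0.7*N(3, 0.5^2)
draws = [sample_mixture(
            [0.3, 0.7],
            [lambda r: r.gauss(-2.0, 1.0), lambda r: r.gauss(3.0, 0.5)],
            rng)
         for _ in range(10000)]
mean = sum(draws) / len(draws)   # close to 0.3*(-2) + 0.7*3 = 1.5
```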
Accept-Reject Methods
Another class of methods only requires the functional form of the density f of interest, called the target density. We simulate from a density g, called the candidate density.
Given a target density f, we need a candidate density g and a constant M such that
f(x) ≤ M g(x)
on the support of f.
Accept-Reject Algorithm:
To produce a variable X distributed according to f:
 1. Generate X ~ g, U ~ Uniform(0,1);
 2. Accept Y = X if U ≤ f(X)/(M g(X));
 3. Otherwise, return to 1.
Notes:
 (a). The densities f and g need be known only up to a multiplicative factor.
 (b). The probability of acceptance is 1/M, when evaluated for the normalized densities.
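As a concrete sketch (Python; the Beta(2,2) target with a Uniform(0,1) candidate is our example, for which M = sup f = 1.5):

```python
import random

def accept_reject(f, g_sample, g_pdf, M, rng):
    # Accept-Reject: draw X ~ g and U ~ Uniform(0,1);
    # accept X as a draw from f when U <= f(X) / (M g(X)).
    while True:
        x = g_sample(rng)
        if rng.random() <= f(x) / (M * g_pdf(x)):
            return x

rng = random.Random(1)
f = lambda x: 6.0 * x * (1.0 - x)          # Beta(2,2) density, max 1.5 at x = 1/2
draws = [accept_reject(f, lambda r: r.random(), lambda x: 1.0, 1.5, rng)
         for _ in range(20000)]
mean = sum(draws) / len(draws)             # Beta(2,2) has mean 1/2
```

Since the acceptance probability is 1/M = 2/3 here, roughly three candidate draws are needed for every two accepted values.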
Envelope Accept-Reject Algorithm:
If the target density f is difficult to evaluate, the Envelope Accept-Reject Algorithm (called the squeeze principle by Marsaglia 1977) may be appropriate. If there exist a density g_m, a function g_l, and a constant M such that
g_l(x) ≤ f(x) ≤ M g_m(x),
then the algorithm
 1. Generate X ~ g_m, U ~ Uniform(0,1);
 2. Accept X if U ≤ g_l(X)/(M g_m(X));
 3. Otherwise, accept X if U ≤ f(X)/(M g_m(X));
 4. Otherwise, return to 1.
produces a variable X distributed according to f.
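A sketch of the squeeze idea (Python; the Normal/double-exponential pair, the bound M = √(2e/π), and the lower squeeze 1 - x²/2 ≤ e^{-x²/2} are our assumptions for this example):

```python
import math
import random

SQRT_2PI = math.sqrt(2.0 * math.pi)
M = math.sqrt(2.0 * math.e / math.pi)    # sup of normal density over DE density

def normal_pdf(x):
    return math.exp(-0.5 * x * x) / SQRT_2PI

def squeeze_lower(x):
    # e^{-t} >= 1 - t gives a cheap lower bound on the normal density
    return max(0.0, 1.0 - 0.5 * x * x) / SQRT_2PI

def de_pdf(x):
    return 0.5 * math.exp(-abs(x))       # double exponential candidate density

def de_sample(rng):
    y = rng.expovariate(1.0)
    return y if rng.random() < 0.5 else -y

def envelope_accept_reject(rng):
    while True:
        x = de_sample(rng)
        u = rng.random()
        bound = M * de_pdf(x)
        if u <= squeeze_lower(x) / bound:    # accept via the cheap squeeze
            return x
        if u <= normal_pdf(x) / bound:       # full test only if squeeze fails
            return x

rng = random.Random(2)
draws = [envelope_accept_reject(rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
```

The target density itself is only evaluated when the squeeze test fails, which is the point of the method when f is expensive.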
Markov Chain Methods
Every simulation method discussed thus far has produced independent random variables whose distribution is exactly the target distribution. In contrast, Markov chain methods produce a sequence of dependent random variables whose distribution converges to the target. Their advantage is their applicability in complex situations.
Recall that a sequence X_0, X_1, ..., X_n, ... of random variables is a Markov chain if, for any n, the conditional distribution of X_{n+1} given x_n, x_{n-1}, ..., x_0 is the same as the distribution of X_{n+1} given x_n; that is,
P(X_{n+1} ∈ A | x_0, x_1, ..., x_n) = P(X_{n+1} ∈ A | x_n).
Metropolis-Hastings Algorithm
The Metropolis-Hastings (M-H) algorithm (Metropolis et al. 1953, Hastings 1970) associated with the target density f and the candidate density q(y|x) produces a Markov chain (X^(t)) through the following transition:
Metropolis-Hastings:
Given x^(t),
 1. Generate Y_t ~ q(y|x^(t)).
 2. Take X^(t+1) = Y_t with probability ρ(x^(t), Y_t), and X^(t+1) = x^(t) with probability 1 - ρ(x^(t), Y_t), where
ρ(x, y) = min{ [f(y)/f(x)] [q(x|y)/q(y|x)], 1 }.
Then X^(t) converges in distribution to f.
Notes:
 (a). For q(y|x) = g(y) we have the independent M-H algorithm, and for q(y|x) = q(x|y) we have a symmetric M-H algorithm, where ρ does not depend on q. Also, Y_t = X^(t) + ε_t, with ε_t symmetric around zero, gives a random walk M-H algorithm.
 (b). Like the Accept-Reject method, the Metropolis-Hastings algorithm only requires knowing f and q up to normalizing constants.
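A sketch of the random walk variant of note (a) (Python; the uniform proposal and the N(0,1) target, known only up to its constant, are our example):

```python
import math
import random

def rw_metropolis(log_f, n, scale, x0, rng):
    # Random walk M-H: propose Y = x + eps with eps ~ Uniform(-scale, scale),
    # a symmetric candidate, so the acceptance probability reduces to
    # min(1, f(Y)/f(x)); working on the log scale avoids underflow.
    chain = [x0]
    x, lx = x0, log_f(x0)
    for _ in range(n):
        y = x + rng.uniform(-scale, scale)
        ly = log_f(y)
        if ly >= lx or rng.random() < math.exp(ly - lx):
            x, lx = y, ly
        chain.append(x)
    return chain

rng = random.Random(3)
log_f = lambda x: -0.5 * x * x     # N(0,1) target, up to a normalizing constant
chain = rw_metropolis(log_f, n=50000, scale=2.0, x0=0.0, rng=rng)
burned = chain[5000:]              # discard an initial burn-in segment
mean = sum(burned) / len(burned)
```

Unlike Accept-Reject, the output values are dependent; only the long-run distribution of the chain matches the target.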
The Gibbs Sampler
For p > 1, write the random variable X as X = (X_1, ..., X_p), where the X_i's are either uni- or multidimensional, with conditional distributions
f_i(x_i | x_1, ..., x_{i-1}, x_{i+1}, ..., x_p)
for i = 1, ..., p. The associated Gibbs sampler is given by the following transition from x^(t) to x^(t+1):
The Gibbs sampler:
Given x^(t) = (x_1^(t), ..., x_p^(t)), generate
 1. X_1^(t+1) ~ f_1(x_1 | x_2^(t), ..., x_p^(t));
 2. X_2^(t+1) ~ f_2(x_2 | x_1^(t+1), x_3^(t), ..., x_p^(t));
 ...
 p. X_p^(t+1) ~ f_p(x_p | x_1^(t+1), ..., x_{p-1}^(t+1)).
Notes:
 (a). The densities f_1, ..., f_p are called the full univariate conditionals.
 (b). Even in a high dimensional problem, all of the simulations can be univariate.
 (c). The Gibbs sampler is, formally, a special case of the M-H algorithm (see Robert and Casella 2004), but with acceptance rate equal to 1.
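A sketch for a standard bivariate normal with correlation ρ, whose full conditionals are Normal(ρx, 1 - ρ²) (Python; the example is ours):

```python
import math
import random

def gibbs_bivariate_normal(rho, n, rng):
    # Gibbs sampler for (X1, X2) standard bivariate normal, correlation rho:
    #   X1 | x2 ~ N(rho * x2, 1 - rho^2),  X2 | x1 ~ N(rho * x1, 1 - rho^2),
    # so every simulation in the sweep is univariate.
    sd = math.sqrt(1.0 - rho * rho)
    x1, x2 = 0.0, 0.0
    out = []
    for _ in range(n):
        x1 = rng.gauss(rho * x2, sd)
        x2 = rng.gauss(rho * x1, sd)
        out.append((x1, x2))
    return out

rng = random.Random(4)
draws = gibbs_bivariate_normal(rho=0.7, n=50000, rng=rng)[5000:]
corr = sum(a * b for a, b in draws) / len(draws)   # estimates E[X1 X2] = rho
```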
The Slice Sampler
A particular version of the Gibbs sampler, called the slice sampler (Besag and Green 1993, Damien et al. 1999), can sometimes be useful. Write f(x) ∝ f_1(x) × ··· × f_k(x), where the f_i's are positive functions, not necessarily densities.
Slice sampler:
Simulate
 1. ω_1^(t+1) ~ Uniform(0, f_1(x^(t)));
 ...
 k. ω_k^(t+1) ~ Uniform(0, f_k(x^(t)));
 k+1. x^(t+1) ~ Uniform(A^(t+1)), with
A^(t+1) = {y : f_i(y) ≥ ω_i^(t+1), i = 1, ..., k}.
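For instance (Python; the case k = 1 with f_1(x) = e^{-x²/2}, the unnormalized Normal(0,1) density, is our example):

```python
import math
import random

def slice_sampler_normal(n, x0, rng):
    # Slice sampler with k = 1 and f1(x) = exp(-x^2/2):
    #   omega ~ Uniform(0, f1(x));  then x' ~ Uniform on {y : f1(y) >= omega},
    # which here is the interval (-sqrt(-2 log omega), +sqrt(-2 log omega)).
    chain = [x0]
    x = x0
    for _ in range(n):
        omega = rng.uniform(0.0, math.exp(-0.5 * x * x))
        half_width = math.sqrt(-2.0 * math.log(omega))
        x = rng.uniform(-half_width, half_width)
        chain.append(x)
    return chain

rng = random.Random(5)
chain = slice_sampler_normal(20000, 0.0, rng)[2000:]
mean = sum(chain) / len(chain)
var = sum(x * x for x in chain) / len(chain)
```

Each sweep only requires inverting the level set {y : f_1(y) ≥ ω}, not evaluating any normalizing constant.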
Application
In this section we give some guidelines for simulating different distributions. See Robert and Casella (2004, 2010) for more detailed explanations, and Devroye (1986) for more algorithms. In what follows, U is Uniform(0,1) unless otherwise specified.
Arcsine
F(x) = (2/π) arcsin(√x), 0 ≤ x ≤ 1; X = sin²(πU/2).
Beta
f(x) ∝ x^{α-1}(1-x)^{β-1}, 0 ≤ x ≤ 1; X = Y_1/(Y_1 + Y_2), where Y_1 ~ Gamma(α, 1) and Y_2 ~ Gamma(β, 1), independent.
Cauchy
f(x) = 1/(πσ[1 + ((x-μ)/σ)²]); X = μ + σ tan(π(U - 1/2)) ~ Cauchy(μ, σ). If (U_1, U_2) is uniform on the unit disk, then U_1/U_2 ~ Cauchy(0, 1). Also, X_1/X_2 ~ Cauchy(0, 1), where X_1, X_2 ~ Normal(0, 1), independent.
Chi squared(p)
f(x) = x^{p/2 - 1} e^{-x/2} / (Γ(p/2) 2^{p/2}), x > 0. For p even, X = -2 log(U_1 U_2 ··· U_{p/2}) ~ Chi squared(p).
Double exponential
f(x) = (1/(2σ)) e^{-|x-μ|/σ}. If Y ~ Exponential(1/σ), then attaching a random sign (positive if U > 1/2, negative otherwise) gives Double exponential(0, σ), and log(U_1/U_2) ~ Double exponential(0, 1). This is also known as the Laplace distribution.
Exponential
f(x) = λ e^{-λx}, x > 0; X = -log(U)/λ ~ Exponential(λ).
Extreme Value
F(x) = exp(-e^{-x}); X = -log(-log U). If Y ~ Exponential(1), then -log Y ~ Extreme Value. This is also known as the Gumbel distribution.
F Distribution
If X_1 ~ Chi squared(ν_1) and X_2 ~ Chi squared(ν_2), independent, then (X_1/ν_1)/(X_2/ν_2) ~ F(ν_1, ν_2).
, If and , independent,Gamma
, , . Then Gamma , an integerIf is not an integer, indirect methods can be used. For example, to generate a Gamma use Algorithm 3.1 or 4.1 with candidate distribution Gamma , with and , where is the greatest integer less than . For the AcceptReject algorithm the bound on the normalized is . There are many other efficient algorithms.
Note:
If Gamma then Gamma . Some special cases are Exponential = gamma , and Chi squared = gamma . Also, has the inverted (or inverse) gamma distribution.Logistic
F(x) = 1/(1 + e^{-(x-μ)/β}); X = μ + β log(U/(1 - U)).
Lognormal
If Y ~ Normal(μ, σ²), then X = e^Y ~ Lognormal(μ, σ²).
Noncentral chi squared(p, λ)
If K ~ Poisson(λ/2), then X ~ Chi squared(p + 2K). Alternatively, for p > 1, X = Y + (Z + √λ)², where Y ~ Chi squared(p - 1) and Z ~ Normal(0, 1), independent.
Normal
f(x) = (1/√(2πσ²)) e^{-(x-μ)²/(2σ²)}. The Box-Muller algorithm simulates two independent Normal(0, 1) variables from two uniforms:
X_1 = √(-2 log U_1) cos(2πU_2), X_2 = √(-2 log U_1) sin(2πU_2).
There are many other ways to generate normal random variables.
 (a). Accept-Reject using Cauchy: when f and g are the densities of the Normal(0, 1) and Cauchy(0, 1) distributions, respectively, then M = sup_x f(x)/g(x) = √(2π/e) ≈ 1.52.
 (b). Accept-Reject using double exponential: when f is the Normal(0, 1) density and g(x) = (1/2)e^{-|x|}, then M = √(2e/π) ≈ 1.32.
 (c). Slice sampler: ω | x ~ Uniform(0, e^{-x²/2}) and x | ω ~ Uniform(-√(-2 log ω), √(-2 log ω)) yields X ~ Normal(0, 1).
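The Box-Muller transform above can be sketched as follows (Python; names are illustrative):

```python
import math
import random

def box_muller(rng):
    # Box-Muller: two independent N(0,1) variables from two uniforms.
    # Using 1 - random() keeps u1 in (0, 1], so log(u1) is always defined.
    u1, u2 = 1.0 - rng.random(), rng.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

rng = random.Random(6)
pairs = [box_muller(rng) for _ in range(10000)]
xs = [a for a, b in pairs] + [b for a, b in pairs]
mean = sum(xs) / len(xs)
var = sum(x * x for x in xs) / len(xs)
```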
Pareto(α, x_0)
f(x) = α x_0^α / x^{α+1}, x > x_0; X = x_0 U^{-1/α}.
Student's t
f(x) ∝ (1 + x²/ν)^{-(ν+1)/2}. If Z ~ Normal(0, 1) and Y ~ Chi squared(ν), independent, then X = Z/√(Y/ν) ~ t(ν).
Uniform(a, b)
f(x) = 1/(b - a), a ≤ x ≤ b; X = a + (b - a)U.
Weibull
f(x) = (α/β)(x/β)^{α-1} e^{-(x/β)^α}, x > 0; X = β(-log U)^{1/α}.
References
 Besag, J. and Green, P.J. (1993). Spatial statistics and Bayesian computation (with discussion). J. Roy. Statist. Soc. B 55, 25-38.
 Damien, P., Wakefield, J. and Walker, S. (1999). Gibbs sampling for Bayesian non-conjugate and hierarchical models by using auxiliary variables. J. Roy. Statist. Soc. B 61(2), 331-344.
 Devroye, L. (1986). Non-Uniform Random Variate Generation. New York: Springer-Verlag.
 Fishman, G.S. (1996). Monte Carlo. New York: Springer-Verlag.
 Gilks, W.R. and Wild, P. (1992). Adaptive rejection sampling for Gibbs sampling. Appl. Statist. 41, 337-348.
 Hastings, W. (1970). Monte Carlo sampling methods using Markov chains and their application. Biometrika 57, 97-109.
 Knuth, D.E. (1998). The Art of Computer Programming, Volume 2: Seminumerical Algorithms, Third Edition. Reading: Addison-Wesley.
 Marsaglia, G. (1977). The squeeze method for generating gamma variables. Computers and Mathematics with Applications 3, 321-325.
 Marsaglia, G. and Zaman, A. (1991). A new class of random number generators. Annals of Applied Probability 1, 462-480.
 Metropolis, N., Rosenbluth, A.W., Rosenbluth, M.N., Teller, A.H. and Teller, E. (1953). Equations of state calculations by fast computing machines. J. Chem. Phys. 21, 1087-1092.
 Ripley, B.D. (1987). Stochastic Simulation. New York: John Wiley.
 Robert, C.P. and Casella, G. (2004). Monte Carlo Statistical Methods, Second Edition. New York: Springer-Verlag.
 Robert, C.P. and Casella, G. (2010). An Introduction to Monte Carlo Methods with R. New York: Springer-Verlag.
 Rubinstein, R.Y. (1981). Simulation and the Monte Carlo Method. New York: John Wiley.
Footnotes
 1. Université Paris Dauphine, 75785 Paris cedex 16, France, and CREST, INSEE. Email: xian@ceremade.dauphine.fr
 2. Department of Statistics and Genetics Institute, University of Florida, Gainesville, FL 32611. Email: casella@ufl.edu