Probability of Two Independent Random Variables Both Having a Gaussian Distribution
Understanding Probability Distributions
When discussing the probability that random variables have some property, it is essential to specify how a probability distribution over such variables is defined or constructed. This involves understanding the generative process by which distributions are chosen and applied.
Generative Process and Probability Distributions
One of the most widely used distributions in statistics is the Gaussian distribution. However, to understand the probability of two independent random variables both having a Gaussian distribution, we must first explore the generative process that defines such distributions.
Pearson Family of Distributions: The Pearson family is a broad set of probability distributions characterized by parameters such as the squared skewness \(\beta_1 \geq 0\) and the traditional (non-excess) kurtosis \(\beta_2 \geq 1\), subject to the constraint \(\beta_2 \geq \beta_1 + 1\).
These parameters can be generated from any non-degenerate continuous distribution. For example, we could limit the parameter space to the triangle \(0 \leq \beta_1 \leq 1\), \(0 \leq \beta_2 \leq \kappa\), and \(\beta_2 \geq \beta_1 + 1\) for some \(\kappa \geq 1\), and use a continuous uniform distribution over this triangle.
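As a concrete sketch of this uniform prior (the function name and the rejection-sampling approach are illustrative assumptions, not from the original text), one can sample from the bounding rectangle and keep only points satisfying the triangle constraint:

```python
import random

def sample_uniform_triangle(kappa=10.0):
    """Draw (beta1, beta2) uniformly over the triangle
    0 <= beta1 <= 1, 0 <= beta2 <= kappa, beta2 >= beta1 + 1,
    by rejection sampling from the bounding rectangle."""
    while True:
        beta1 = random.uniform(0.0, 1.0)
        beta2 = random.uniform(0.0, kappa)
        # Accept only points inside the constrained triangle.
        if beta2 >= beta1 + 1.0:
            return beta1, beta2
```

Because the proposal is uniform over the rectangle, each accepted point is uniform over the triangle.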
Alternatively, if \(\beta_1\) is exponentially distributed and \(\beta_2 = (\beta_1 + 1) + \theta\), where \(\theta\) is an independent exponentially distributed variable, this construction provides another way to define a continuous prior for \(\beta_1\) and \(\beta_2\).
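A minimal sketch of this exponential construction (the function name and unit rates are illustrative assumptions):

```python
import random

def sample_exponential_prior(rate_beta1=1.0, rate_theta=1.0):
    """Draw beta1 ~ Exp(rate_beta1) and theta ~ Exp(rate_theta),
    then set beta2 = (beta1 + 1) + theta, so the Pearson constraint
    beta2 >= beta1 + 1 holds automatically."""
    beta1 = random.expovariate(rate_beta1)
    theta = random.expovariate(rate_theta)
    beta2 = (beta1 + 1.0) + theta
    return beta1, beta2
```

Unlike the triangle prior, this one needs no rejection step: the constraint is built into the formula for \(\beta_2\).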
By constructing such a continuous distribution, the probability of a random variable being perfectly Gaussian (i.e., corresponding to the single point \((\beta_1, \beta_2) = (0, 3)\) in the Pearson parameter space) is zero. This probability depends significantly on the underlying generative process.
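The measure-zero claim can be checked empirically. A quick Monte Carlo sketch, assuming the uniform-triangle prior above with \(\kappa = 10\) (one illustrative choice among many continuous priors):

```python
import random

def draw_params(kappa=10.0):
    # Rejection-sample a continuous prior over the Pearson triangle.
    while True:
        b1 = random.uniform(0.0, 1.0)
        b2 = random.uniform(0.0, kappa)
        if b2 >= b1 + 1.0:
            return b1, b2

# The single Gaussian point (beta1, beta2) = (0, 3) is never hit
# exactly by draws from a continuous distribution.
hits = sum(draw_params() == (0.0, 3.0) for _ in range(100_000))
```

No finite sample from a continuous prior lands exactly on the Gaussian point, mirroring the fact that any single point has Lebesgue measure zero.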
Generative Process and Probability Interdependence
Knowing that two random variables (RVs) are independent does not by itself tell us whether both have a Gaussian distribution. This is crucial: independence constrains the joint behavior of the RVs, not their marginal distributions, so it does not imply that their distributions are similar, let alone Gaussian.
Case 1: Independent and Identically Distributed RVs: Consider a generative process in which a distribution is drawn from a common prior for the first RV, and every subsequent RV is then independent and identically distributed (iid) with it. Under this process, the probability of two independent RVs both having a Gaussian distribution is identical to the probability of the first RV being Gaussian.
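A toy version of this iid process (the mixture prior with an atom of weight `p_gaussian` at the Gaussian point is an illustrative assumption, introduced so the probability is nonzero and visible):

```python
import random

def draw_distribution(p_gaussian=0.3):
    """With probability p_gaussian, pick the Gaussian point (0, 3);
    otherwise draw a non-Gaussian point from the Pearson triangle."""
    if random.random() < p_gaussian:
        return (0.0, 3.0)
    while True:
        b1 = random.uniform(0.0, 1.0)
        b2 = random.uniform(0.0, 10.0)
        if b2 >= b1 + 1.0 and (b1, b2) != (0.0, 3.0):
            return (b1, b2)

def draw_iid_pair(p_gaussian=0.3):
    """Case 1: one distribution is drawn and shared by both RVs,
    so 'both Gaussian' is the same event as 'first Gaussian'."""
    params = draw_distribution(p_gaussian)
    return params, params
```

Over many draws, the fraction of pairs where both RVs are Gaussian converges to `p_gaussian`, the probability that the first RV is Gaussian.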
Case 2: Different Distributions: If the generative process picks a distribution for the first RV from the Pearson parameter space and then ensures the second RV's distribution is different, the probability of both RVs being Gaussian is zero by construction. Here, even though the RVs are independent, the selection process guarantees distinct distributions.
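Case 2 can be sketched the same way (same illustrative mixture prior as in Case 1; the redraw-until-different loop encodes the "must be different" rule):

```python
import random

def draw_distribution(p_gaussian=0.3):
    """With probability p_gaussian, pick the Gaussian point (0, 3);
    otherwise draw a point from the Pearson triangle."""
    if random.random() < p_gaussian:
        return (0.0, 3.0)
    while True:
        b1 = random.uniform(0.0, 1.0)
        b2 = random.uniform(0.0, 10.0)
        if b2 >= b1 + 1.0:
            return (b1, b2)

def draw_distinct_pair(p_gaussian=0.3):
    """Case 2: redraw the second distribution until it differs from
    the first.  'Both Gaussian' would require both to equal (0, 3),
    which the rejection step rules out, so P(both Gaussian) = 0."""
    first = draw_distribution(p_gaussian)
    while True:
        second = draw_distribution(p_gaussian)
        if second != first:
            return first, second
```

The RVs remain independent in the usual sense, yet the event "both Gaussian" is impossible by construction.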
Assumptions and Probability Calculation
To discuss the probability of two random variables having a particular property, one must specify a generative process that produces those random variables. That process allows us to define and study the probabilities of arriving at the random variables we are interested in.
By defining clear generative processes, we can explore and calculate the probabilities of various outcomes, including the likelihood of two independent RVs both having a Gaussian distribution. This involves understanding the distribution chosen for each variable and how that distribution is applied through the generative process.